Patients undergo plastic surgery for a variety of reasons. They may have broken bones that need to be reconstructed. They may have been burned and need skin grafts. They may have malformations or body dysmorphia. Or they may simply want to improve the look of some part of their body. Regardless of the reason, plastic surgery has been shown to improve the quality of life of those who have it done.
One study showed that plastic surgery not only improved patients’ quality of life, but also reduced anxiety related to dysmorphia. “Based on baseline measures and responses at 3, 6 and 12 months post-surgery or post-consultation, the researchers found that those who completed surgery showed improvements in appearance satisfaction, goal attainment, well-being, life quality, self-perceived attractiveness and decreased anxiety and dysmorphia, compared to the non-surgery group.” Sometimes these surgeries even increase the perceived significance of looks when patients weigh job promotions or career changes.
Plastic surgery goes back as far as 600 BC, when a surgeon reconstructed a nose with cheek tissue. Massive improvements came in the 19th century with the advent of anesthesia and the practice of using antiseptics to prevent infection. Then World War I forced many plastic surgeons to hone far more detail-oriented skill sets, paving the way for the first cosmetic rhinoplasty in 1923 and the first face lift in 1931.
In this day and age, plastic surgery has come to include much more than cosmetic surgery, which in and of itself encompasses a variety of procedures, including but not limited to, “Botox treatments,… chemical peels,… microdermabrasions,… laser hair removals,… vein sclerotherapies (strippings),… liposuctions,… rhinoplasties (nose jobs),… breast augmentations,… blepharoplasties (eyelid reconstructions),… abdominoplasties,… breast reductions,… vaginal rejuvenation procedures,… calf augmentations, and… pectoral implants.” Plastic surgery also encompasses, for example, work with children who have congenital anomalies and disorders and need reconstructive surgery to function properly and have a more “normal” look. Surgeries like these provide true hope for people in need of reconstruction. They are a blessing that allows patients to move forward with fewer health problems and a brighter future.
As another example, people born with cleft palates must have surgery within the first year of life. Correcting a cleft palate can give a child the ability to speak properly, eat and digest food normally, and breathe without obstruction. These surgeries are extremely important to children’s wellbeing as they move forward in their lives, reducing the likelihood that they feel shame over their appearance and helping them function properly.
Plastic surgery can be life saving. It can even be fun. Plastic surgery can improve the quality of life for those who receive it.
Dementia is a term most commonly associated with the elderly, but there is evidence that the incidence of this disease is on the decline. The reasons for the decline are not yet clear, and research into potential risk factors is ongoing; still, existing research has given health care professionals a picture of what causes dementia, as well as promising ways to reduce, and in some cases treat, its symptoms.
Dementia is a general term for a wider array of symptoms. The basic idea is that certain aspects of cognitive function become impaired to the point where they interfere with daily life. The cause of these impairments is damage to cells in the brain, which hinders whatever part of the brain those cells supported. This damage can be caused by various factors, and while the damage done to the cells is irreversible, it is sometimes possible to treat the underlying causes of damage, thus stopping or slowing the worsening of symptoms. (For more detailed information, see http://www.alz.org/what-is-dementia.asp.)
The occurrence of these symptoms in the elderly was once attributed to the idea that they were merely part of the aging process. However, the underlying cause is actually damaged brain cells, many older people never develop these symptoms, and measurements show that their incidence has decreased (see http://jamanetwork.com/journals/jamainternalmedicine/article-abstract/2587084), so it is fair to say this was a misconception. The current view among health care professionals is that, while age can be a risk factor, it is neither the only one to consider nor the sole predictor of symptoms.
Because the symptoms associated with this disease, and the underlying damage that causes them, vary so widely, there are different types of dementia, and because there are different types, there are various ways to try to treat or prevent symptoms. The advice health care professionals offer for lowering the risk of developing certain symptoms is much the same as the advice offered to anyone who wants to live a healthier lifestyle. Do not smoke or drink alcohol; both can damage blood vessels, and anything that damages blood vessels can damage brain cells. Get a good amount of exercise; it increases oxygen levels and helps maintain healthy blood circulation, both of which are good for the brain. Lastly, eat right; a heart-healthy diet helps the heart keep regulating blood circulation and oxygen levels, one of the most important requirements of a healthy brain.
The important thing to take away from all of this is that if people take the time and initiative to care for their brain, body, and mind, they will have those things much longer than they will if they don’t.
Is sugar the culprit behind the obesity epidemic in America? Maybe it’s not solely responsible, but a glance towards the past reveals that the collective approach to diet— and where we assign blame for chronic ailments like heart disease and diabetes— has been wrong for decades.
A new paper published in JAMA provides a historical context for why so many people consider dietary fat to be the cause of weight issues and accompanying ailments. In 1967, the Sugar Association (then known as the Sugar Research Foundation) paid three Harvard researchers a hefty sum to publish a paper on sugar, fat, and heart disease. Predictably, the Sugar Association decided which studies would be used in the review— the result being a body of work that exonerated sugar as a contributing factor for heart disease. Instead, blame was placed on fat, and the low-fat craze took off. Breakfast staples like bacon and eggs were vilified, and consumers went out of their way to purchase “low-fat” varieties of familiar brands.
There was only one problem. On their own, low-fat products don’t taste that great. To compensate, sugar was added to many of these foods. So while a consumer thinks he’s eating a healthier alternative, there’s just as much sugar present as in the full-fat version, if not more. Today, though, we know better. A patient with diabetes, for instance, will benefit most from eliminating refined carbohydrates and refined sugars from their diet, as opposed to maintaining a carb-heavy intake but lowering dietary fats.
But if the science backs up the idea that sugar plays a huge role in the obesity crisis, how did we get here in the first place? The answer is relatively simple. That review funded by the Sugar Association went on to be published in the New England Journal of Medicine, a prestigious academic source. As you might guess, papers published in journals of that caliber are the ones that drive scientific discussion. It’s an exposure issue. If the scientific community is buzzing around a handful of trusted sources, and one spouts information like that provided by the Sugar Association, the findings will spread like wildfire.
Today, sugar dominates the American diet. One University of North Carolina study found that 60% of foods purchased in grocery stores contain added sugar. There are the obvious items, like confectionery goods and pastries, but added sugar also lurks in supposedly healthy foods like pastas and cereals.
The scientific community has known for quite some time about the real role of fats and carbohydrates in the diet, but public perception is still rooted in the past. Perhaps this paper can help change that.
The Zika virus has found its way to the continental United States. Florida is experiencing its own outbreak at the moment, although it is a bit too early to tell how rapidly or widely it will spread. According to NPR, there are only 37 cases reported as of late August. However, that number alone isn’t enough to tell how severe the situation really is. Many people who are carrying the Zika virus don’t actually know they have it— 80% of those infected are asymptomatic. In fact, according to computer scientist Alessandro Vespignani, the estimated detection rate of Zika hovers around just 5 percent.
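To see how far the reported count might understate the outbreak, one can apply the estimated 5 percent detection rate to the 37 reported cases. A back-of-the-envelope sketch in Python (treating both figures as exact, which they are not):

```python
# Rough estimate of total Zika infections implied by a low detection rate.
reported_cases = 37        # Florida cases reported as of late August
detection_rate = 0.05      # Vespignani's estimated ~5% detection rate
asymptomatic_share = 0.80  # 80% of those infected show no symptoms

# If only ~5% of infections are ever detected, the true total is ~20x higher.
estimated_total = reported_cases / detection_rate
estimated_asymptomatic = estimated_total * asymptomatic_share

print(round(estimated_total))         # roughly 740 infections implied
print(round(estimated_asymptomatic))  # roughly 592 of them asymptomatic
```

This is only an illustration of the arithmetic; real epidemiological estimates account for far more uncertainty.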
But because of the confirmed cases, the CDC has issued a travel warning. Pregnant women, and those trying to get pregnant, are advised to avoid the Miami-Dade area. And all people who do make the trip are encouraged to cover up, which is something of an issue when most tourists choose Miami because it’s a beach town. The number of infected is projected to rise, though, even as cooler temperatures temper mosquito populations. The reason? College! Thousands of students will converge for classes again, swelling the population, and with it the available sources of nutrients for female mosquitoes.
While not known to be deadly, the virus is probably best known for its connection to microcephaly, a birth defect that results in a smaller-than-usual head and severe neurological impairment. Most recently, scientists have found that there may be neurological consequences for infected adults, too.
We’re still searching for a long-term solution to eliminate Zika. One interesting method, profiled in a feature for Wired, involves releasing hundreds of thousands of male mosquitoes infected with the Wolbachia bacteria into Zika hotspots. While fighting mosquitoes with more mosquitoes seems counterintuitive, it could work: once the infected males mate with Zika-carrying females, the bacteria prevent the eggs from hatching. The approach is currently being explored in Clovis, California. Perhaps it will spread elsewhere.
From kissing to touching our phones, there are many everyday practices that result in the exchange of germs. The go-to answer is of course to wash our hands with soap and water, but how well does soap really work? To figure this out, we need to learn how soap works.
First, let’s define a germ. A germ is a microscopic particle or organism that can make us sick. This includes bacteria and viruses. Because of the oils on our skin, most of the substances that we want to wash off of our hands, be they germs or dirt, adhere to our skin. The associated germs can be removed using a solvent such as alcohol or kerosene.
Soaps used in hospitals are often strong and alcohol-based. However, alcohol and kerosene are, to varying degrees, toxic on their own, which makes them less than ideal for frequent in-home use. Not to mention that smelling like kerosene all day would be unpleasant. Soap lets us avoid these substances altogether.
Rather than dissolving oil, soap works in a more mechanical way. It essentially convinces the oil particles and their associated germs to link up with water molecules, allowing the germs to be washed away more readily.
To picture this, we need to think about what a soap molecule looks like. A soap molecule is structured as a long chain of carbon and hydrogen atoms. One end of the chain, called the hydrophilic end, readily joins up with water. The other end, called the hydrophobic end, avoids water; instead, it readily attaches to grease. When you wash your hands, the hydrophobic end attaches to germ-hosting oil particles while the hydrophilic end coaxes those particles toward the water to be washed away. The hydrophobic end also wedges itself between water molecules to escape the water, forcing the molecules apart and thus reducing surface tension.
If you want to see how soap reduces surface tension, fill a glass with water almost to the point of spilling. Surface tension is what keeps the water in the glass. If you tap the top with a soapy finger or add a drop of soap, the water will overflow as the surface tension drops.
Now that we know how soap works, let’s get to the bottom of this. Does soap actually kill 99.9% of germs, as people claim? Many companies make this claim, but it is far from true. The 99.9% figure is based on laboratory results, and those tests are conducted under conditions very different from the messy conditions of our everyday lives. The claim may also only apply to the more common germs, and the tests involve a thorough scrubbing. Most importantly, soap doesn’t kill germs; it just removes them.
Even in the best case scenario in which you remove 99.9% of germs every time you wash, 0.1% of germs can still be a lot of germs depending on how many germs were already on your hands or on the surface you cleaned.
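To put that best case in perspective, here is a minimal sketch of the arithmetic. The starting germ count is a hypothetical figure chosen for illustration, not a number from the source:

```python
# Germs surviving a 99.9%-effective wash, for a hypothetical starting load.
starting_germs = 1_000_000  # hypothetical germ count on unwashed hands
removal_rate = 0.999        # the advertised 99.9% removal

remaining = starting_germs * (1 - removal_rate)
print(round(remaining))  # 1000 germs still on your hands in the best case
```

Even the advertised best case leaves plenty of germs behind in absolute terms.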
Using soap and water is certainly better than not washing your hands, but soap isn’t nearly as effective at removing germs as people state. This is especially important in healthcare facilities, where germs can be transferred to patients and throughout the facilities even after washing hands. Hand-washing is a crucial practice. Hopefully, we can find a more effective alternative to soap in the future.
Click here to listen to the full “Everyday Einstein: Quick and Dirty Tips” podcast.
Arrhythmia is a heart condition in which the heart’s electrical impulses occur erratically, and it can lead to sudden death. To save the lives of people at risk for life-threatening arrhythmia, doctors typically implant a small defibrillator, a device that can sense the onset of arrhythmia and jolt the heart back to normal rhythm. However, the defibrillator carries its own health risks, and there is considerable debate surrounding the methods doctors use to decide which patients truly need this invasive and costly implant.
This debate has been going on for some time, but an interdisciplinary Johns Hopkins University team is working toward an answer. The team has developed a non-invasive 3D virtual heart assessment tool that will help doctors determine whether a patient faces a high risk of life-threatening arrhythmia, and therefore whether the patient needs the defibrillator implant. According to a proof-of-concept study published in the online journal Nature Communications, this new digital approach yielded more accurate predictions than the method currently used, an imprecise blood-pumping measurement.
For her study, Natalia Trayanova, the university’s inaugural Murray B. Sachs Professor of Biomedical Engineering, joined forces with cardiologist and co-author Katherine C. Wu. Wu is an associate professor in the Johns Hopkins School of Medicine, and her research has focused on magnetic resonance imaging approaches to improving the prediction of cardiovascular risk.
The team was able to form predictions for this study using the distinctive MRI records of patients who had survived a heart attack, but as a result now have damaged cardiac tissue that predisposes the heart to life-threatening arrhythmias. The study was blinded, meaning that the team members were unaware of how closely their forecast matched the events that happened to the patients in real life until after the study. Data was taken from 41 patients who had survived heart attacks and had ejection fractions of less than 35 percent.
The ejection fraction is a measure of the amount of blood being pumped out of the heart, and physicians typically recommend implantable defibrillators for all patients with ejection fractions of less than 35 percent. All 41 patients received the implants, but research has shown that the ejection fraction score is a flawed measurement. It does not accurately predict which patients are at high risk for a sudden cardiac death.
The Johns Hopkins team used pre-implant MRI scans to create patient-specific digital replicas of their hearts. With the help of computer-modeling techniques, each replica was brought to life with representations of the communication among cells and the electrical processes in the cardiac cells. In some cases, the heart developed an arrhythmia, and in other cases, it did not. This non-invasive way to gauge the risk of sudden cardiac death has been dubbed VARP, which stands for virtual-heart arrhythmia risk predictor.
The VARP results were later compared to the post-implantation records of the defibrillator recipients in order to see how well the technology predicted life-threatening arrhythmias that were detected and halted by the implanted defibrillators. Patients who tested positive for arrhythmia risk using VARP were four times more likely to develop arrhythmia than those who tested negative. VARP also predicted the arrhythmia occurrence in patients four-to-five times better than the ejection fraction and other predictors that are currently used, both invasive and non-invasive.
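The “four times more likely” figure is a relative risk: the arrhythmia rate among VARP-positive patients divided by the rate among VARP-negative patients. Here is a sketch with hypothetical counts (the study’s actual per-group numbers are not given in this article), chosen only to reproduce that ratio:

```python
# Relative risk of arrhythmia, VARP-positive vs. VARP-negative patients.
# These counts are hypothetical, chosen only to illustrate a 4x ratio.
events_pos, n_pos = 8, 20  # arrhythmias among VARP-positive patients
events_neg, n_neg = 1, 10  # arrhythmias among VARP-negative patients

risk_pos = events_pos / n_pos        # 0.4
risk_neg = events_neg / n_neg        # 0.1
relative_risk = risk_pos / risk_neg
print(round(relative_risk, 1))       # 4.0
```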
Implantable defibrillators come with risks including device malfunctions, infections, and, in rare cases, heart or blood vessel damage. These risks can be eliminated by avoiding implantation of the device in patients for whom it is not truly needed.
This is a big step in helping patients with heart arrhythmia live long and healthy lives. Cardiologists often get a lot of data about patients, but ultimately don’t use much of it for individualized care. With the groundbreaking technology that allows doctors to create a personalized virtual 3D heart, cardiologists can test an individual patient’s heart virtually without doing invasive procedures.
Hospital-related errors are the third leading cause of death, according to a revealing study from earlier this year. As uncomfortable as that may be, the numbers don’t lie. So what puts hospital care in such a precarious situation?
Listen: “One way to cut medical errors: keep patients at home,” by Dan Gorenstein of APM’s Marketplace
It’s important to remember that many of those who suffer from hospital errors are elderly. Falls may be more common in hospitals, and a fall can be disastrous for a frail body. There’s also the refusal to eat out of dislike for hospital food. But most dangerous of all is the specter of superbugs, pathogens that are resistant to many conventional antivirals and antibiotics. These are all unfortunate factors contributing to a problem with mortal consequences, but one New York City hospital is looking to change that.
Mt. Sinai is taking steps to fight hospital errors by taking patients out of the hospital. Studies have shown that patients who receive at-home treatment are 19% more likely to be alive than those who receive it in a hospital wing. This is especially true for older patients, who may have mobility issues and who benefit from being in familiar surroundings.
At first, one might call Mt. Sinai’s “Hospital at Home” program unprecedented, until one remembers that it wasn’t so long ago that doctors and nurses would come to a patient’s home to examine or treat them. In addition to the safety benefits that at-home visits offer, they are cheaper, too.
Right now, most home-care programs are extremely limited because Medicare is not quite ready or willing to cover the associated costs. But with two decades’ worth of research backing up Mt. Sinai’s program, the government is keeping it on notice. In addition to lowering overall costs, home treatment reduces readmissions and keeps client satisfaction high. There’s also a logistical challenge: getting all of that equipment into a New York City apartment isn’t easy.
As good as this idea is, there is yet another problem: not all patients will be able to care for themselves after treatment. Some ailments may require specialized help; other patients may simply be unable to perform routines that were possible in their youth. If too many people are readmitted, the hospital stands to lose federal funding. On the other hand, high expenditures on keeping patients healthy at home may leave the hospital unable to realize its savings. So Sinai needs to draw the line between who can reliably benefit from at-home care and who should stay in a hospital bed. It’s quite the challenge, but if the kinks are worked out it could save lives.
Source: APM healthcare
Artificial intelligence may still seem foreign to some people, but IBM is determined to make it the future of healthcare, reports ZDNet.
Since its AI program, Watson, defeated human opponents on the game show Jeopardy!, the tech giant has funneled resources into making its supercomputer work for people, not against them. There are both high expectations and room for continued growth: in the years since IBM’s announcement, AI has been valued at as much as $600 million (in 2014) and is expected to be worth $6 billion by 2021.
A shift has happened. AI is no longer an idea that needs preliminary exploration; we’ve seen what it can do by now. As Frost & Sullivan’s Venkat Rajan says, AI is moving toward commercial adoption.
The growing demand for AI systems isn’t about novelty or embracing future technology for its own sake. Instead, the demand is driven by necessity. Doctors alone cannot keep tabs on the ever-increasing body of data surrounding patients and their individual statistics, and traditional management systems are becoming increasingly inefficient. It’s easy to see how overburdened the human mind can become with a glut of information. Beyond that, error could lead to potentially life-altering or life-ending consequences. AI can not only retain all of this data, but also use it to recommend certain treatment options. The possibilities grow further when one considers that patients may move and change physicians: an AI system could have all of their previous health data, trends, and analysis at the ready to assist the new doctor.
IBM may be leading the AI revolution, but Google is looking toward it, too. Google’s DeepMind healthcare system has no AI component yet, but it is expected to gain one in the future.
Another crucial role AI could play is in disease prevention and early detection. By analyzing a patient’s health history, lifestyle, and genetic predisposition, a system could recommend the optimal treatment options, be it medication or a lifestyle change.
AI could also be involved in drug development. With all of that data on hand, it could be easier to identify which population segments may be best suited for a trial, or provide some insight as to why one population sample reacted to a drug better than another one did.
AI doesn’t only need to be for doctors; it can help patients as well. You know how sometimes you go to a doctor wondering whether a seemingly benign symptom could be a sign of something more serious? AI could help with that.
Of course, AI won’t be perfect. There are going to be underlying privacy issues and liability risks to deal with. But you should expect to see it play a role in the future.
There’s no shortage of ad campaigns out there encouraging us to “think big”— or, in the case of Apple, to “think different”. But these expectations are wildly unrealistic, and can lead to unwanted drawbacks. This is especially true in health care.
Earlier this week, Dartmouth’s Chris Trimble explained why this is the case. In a piece for the Harvard Business Review, Trimble argued that the best way for health care systems to improve is not in leaps and bounds, but in measured increments brought on by “small, dedicated teams”.
Trimble turns to history to make his point. Back in 1998, Essentia Health charged a small, dedicated team with the task of improving how health care was delivered to recently discharged patients battling congestive heart failure. CHF patients often lack ongoing care or education to keep their condition in check, and all too often wind up right back in the hospital. In Essentia’s case, the small team met with patients, fostered relationships, and dispensed valuable medical advice to keep them healthy. Over time, this full-time team grew in size and has experienced major success. In the nearly two decades since it was formed, discharged patients have been able to lead healthier lives, with fewer return trips to the hospital. But even though the Essentia CHF team had clear benefits for both patients and hospital staff, few in the healthcare community gave the initiative the praise it deserved.
Because fee-for-service reimbursement reigns supreme, and under that paradigm, innovations like Essentia’s— characterized by small, full-time teams— lose money. And even though these innovations could be incredibly helpful to the industry, few companies are willing to tap into their potential. The ultimate reason, writes Trimble, is that we seem to be stuck in an all-or-nothing approach to innovation.
He goes on to share a graph that illustrates the spectrum of change. On one end are small improvements in productivity and the minutiae of labor, which he calls “frontline process improvements”; typically, they are small projects that involve staff only part-time. On the other end are projects that “think big”: they disrupt, change the way the entire industry does business, or run with a radical new idea. But Trimble asserts that even though such ideas are what everyone desires, pursuing them can be detrimental to an organization for two reasons: they tie up full-time staff, and they often call for immense capital and resources, resources some institutions rarely have.
Instead, Trimble says we should focus on the opportunities for innovation that fall in the middle of that spectrum. A perfect example is the Essentia case he opened with. At the end of the day, any organization wants results, and while dreaming of new infrastructure or equipment is a wonderful thing, the chances of it materializing every time are slim. Conversely, small changes that make existing practices better are a more realistic option: they cost less and have a greater chance of actually delivering results.
Next time your organization starts to “think big”, reel it in and pass the torch to a dedicated team that is committed to making what you already have that much better. Sometimes the biggest innovations begin with the smallest changes.
In the past few years, large companies have been scrambling to harness and interpret the data that is linked with our lives. Of course, we are familiar with how this relates to our streaming video habits and social media preferences, but the human body also produces huge quantities of data. From the number of steps we take, to our heart rate, to blood sugar, to the amount of calories we consume, our very existence is the product of an intricate numbers game. So, there’s no question that the healthcare industry is exploring how wearables could impact patient treatment. Wearables are officially a thing, and as we move from the “what if” to the “how to”, MedCity News has compiled a list of four things to look for in healthcare wearables.
When Apple introduced HealthKit in 2014, a new world of data sharing was made available to app developers. The HealthKit framework was unique in that it allowed compatible apps to talk with the native Health app, and share data with other developers and devices without the need to form partnerships. Right now, HealthKit is being used to track users’ step counts and calorie intake. But ever since Apple made the push for their framework to be integrated into clinical uses, we could see apps go beyond reminding you to drink more water and instead play a role in diagnosis and treatment.
Shifting to Actionable Information
We have gotten very good at collecting data. Our phones can act as pedometers, and little wristbands can give us feedback on the quality of our sleep. However, this huge list of data is often just that: a static list. The next question is figuring out how to use that data to live fuller, healthier lives. Rapidly advancing analytics seem to be taking the first steps, as data is cross-referenced and checked to give us personalized insights and instructions on how to improve our quality of life.
The data collected from wearables could easily be used by insurance companies to assess risks and create incentive based programs for healthy behavior. They can also be used for improving patient care, as physicians could monitor several aspects of a patient’s well-being in real time. Lastly, the pharmaceutical industry could take advantage of these new technologies by tracking client responses to new drugs, refining and expediting the FDA approval process.
Improved Patient Monitoring
This might be the most important application, especially for people with chronic illnesses or those who need a caretaker. Remember that old Life Alert commercial? Wearables and the data gathered from the body could streamline the process of getting assistance when it’s needed most. For example, a monitored patient whose blood sugar takes a sharp drop, or whose heartbeat grows increasingly erratic, could get medical help as soon as those symptoms appear.
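As a minimal sketch, an alert rule like the one described above might look like this in software. The function name, reading names, and thresholds here are all hypothetical, chosen for illustration rather than taken from any clinical guideline:

```python
# Hypothetical threshold-based alert check for a remotely monitored patient.
LOW_GLUCOSE_MG_DL = 70           # illustrative floor for blood sugar
SAFE_HEART_RATE_BPM = (40, 150)  # illustrative resting heart-rate band

def needs_alert(glucose_mg_dl, heart_rate_bpm):
    """Return True when a reading should summon medical help."""
    if glucose_mg_dl < LOW_GLUCOSE_MG_DL:
        return True  # blood sugar has taken a sharp drop
    low, high = SAFE_HEART_RATE_BPM
    if not low <= heart_rate_bpm <= high:
        return True  # heartbeat is outside the expected range
    return False

print(needs_alert(glucose_mg_dl=55, heart_rate_bpm=80))  # True: low glucose
print(needs_alert(glucose_mg_dl=95, heart_rate_bpm=75))  # False: in range
```

A real monitoring system would smooth noisy sensor readings and look at trends over time rather than reacting to single values.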