Maybe you’re one of the estimated 50 to 70 million Americans who suffer from sleep disorders, including insomnia; maybe you’re also among the 4 percent of American adults who rely on prescription medication in order to fall asleep. If so, Matt Walker, a professor of neuroscience and psychology at the University of California, Berkeley, has a bit of bad news for you.
In a section of his new book, Why We Sleep, Walker explores the latest scientific research to show the unfortunate truth about sleeping pills: They don’t work as well as we wish they did. Sleep medications don’t deliver the same restorative benefits as natural sleep, and even though people who take them often swear by them, the research suggests that the drugs don’t tend to increase sleep quality beyond placebos. Currently, Walker says, the best available treatment method for combating chronic sleeplessness is not pharmacological at all; it’s psychological.
Recently, we spoke with Walker about this aspect of his book, including his skepticism over sleeping pills and his enthusiasm for cognitive behavioral therapy for insomnia, or CBT-I. What follows is a lightly edited and condensed version of our conversation.
There’s a lengthy section toward the end of the book that discusses your concerns about sleeping pills. What’s most worrisome about them to you?
The quality of sleep that you have when you’re on these drugs is not the same as normal, naturalistic sleep. They’re classified as “sedative hypnotics,” so the drugs actually just sedate you — and sedation is not sleep.
And you argue that this distinction, between natural sleep and sedation, is why sleeping pills don’t deliver the same benefits as sleep itself. Right?
That’s right. The way that they work is by targeting a set of receptors, or “welcome sites,” in the brain that the drugs latch onto to basically stop your brain cells from firing. They principally attack those sites in the cortex, this wrinkle of tissue on the top of your brain, and they just switch off the top of your cortex and put you into a state of unconsciousness.
Sleep, in contrast, is this incredibly complex ballet of neurochemical brilliance that results in numerous areas of the brain both switching on and switching off. We don’t have any good pharmacological approach right now to replicate such a nuanced and complex set of biological changes.
My second problem with sleeping pills: They don’t tend to increase sleep much beyond placebos. People may be fooled into thinking that they’re getting more sleep, but actually they’re not. This wasn’t my conclusion; it came from a committee of experts who reviewed 65 separate drug-placebo studies, and their finding was simple: There was no objective benefit of sleeping pills beyond placebo. Their summary was that the impact of sleeping pills was small, and of questionable clinical importance.
And my third problem: They are associated with a higher risk of death and cancer.
Read more at The Cut.
The most iconic image of midcentury American architecture is arguably Julius Shulman’s photo of the glass-walled Case Study No. 22 house in Los Angeles, which appears to float weightlessly, almost magically above the city. The appeal of the image—which Time magazine called “the most successful real estate image ever taken” (and which was in fact staged with models in cocktail attire)—lies in the way that the silhouetted inhabitants appear to live in another plane, absent any extraneous furnishings or walls, yet safely enclosed and bathed in the home’s light. The luxury the house evokes is neither gaudy nor accessible; it is desirable because of what and who isn’t there—walls, clutter, crowds, or street. Shulman’s photo and the architecture it depicts have in the years since helped stoke a mimetic desire for a weightless, minimalist, perfectly curated life, a desire that now drives an entire industry of midcentury real estate, furniture, and associated lifestyle goods.
But midcentury modern homes are increasingly rare and can require expensive repairs, while suburban upper-middle-class homes built after the midcentury period, with their thick walls and frequently Southwestern or Mediterranean features, tend to be the formal opposite of the Stahl House, as Case Study No. 22 is known. With actual midcentury homes out of reach for most, developers and architects are now attempting to satisfy—and of course sell to—this desire with midcentury-inspired construction. But the new midcentury-inspired home does not look quite like the Case Study house in Shulman’s photo. Comparing Case Study House No. 22 and its ilk to new midcentury-inspired homes tells us not just what was so appealing about midcentury architecture, but also what architecture has lost since that period.
Midcentury modern architecture has been less popular with practicing architects than with homebuyers, since architects are incentivized by their trade and its publications to architect forward, not backward. Several architects I spoke to said that even as the midcentury fervor has grown, many refuse to rebuild the old styles, favoring new work in organic and futuristic forms over repetitions of old designs. According to architect Ray Kappe, who is known for his glassy, transparent midcentury home designs, “most graduates of schools of architecture since the ’50s, ’60s and ’70s have wanted to move architectural ideas forward. They are interested in having their work published in the magazines and books, [and] most publications are presenting other work.”
“We would rather design for this era than a 70-year-old era,” says Palm Springs architect James Cioffi, who worked in the ’70s with iconic midcentury architects like Hugh Kaptur and says he is often called a midcentury architect but doesn’t consider himself one. Cioffi, with other contemporary architects, like Lance O’Donnell, is building new homes in an area of Palm Springs called Desert Palisades. These homes are intended to be truly modern, rather than what Cioffi calls “throwbacks.”
Read more at Curbed.
In 2005, two years after Sameer Sahay arrived in the United States from India to pursue an MBA, he was thrilled when an Oregon health care company hired him and agreed to sponsor his green card. His life as an American, he thought, had begun.
Twelve years later, Sahay, now 50, is still a data architect, still working for the same firm, and still waiting for that green card. It’s not clear when he’ll emerge from the government backlog. He does know that his provisional status stalled his career — changing jobs would have required the company to file a new petition.
“Personally, I have sacrificed my career to help my family to have a better life,” Sahay says. “That has taken its toll. Had I gotten a green card, I could have moved on, moved up, done a lot more things. This held me where I was 10 years ago.”
Tangled and contradictory immigration policies of this sort have frustrated Indian immigrants for years, but the United States was seen as a prize worth pursuing. Now, though, many Indians — long a vital pillar of U.S. hospitals, tech firms, and engineering efforts — are reconsidering their options. Despite a chummy Rose Garden meeting between U.S. President Donald Trump and Indian Prime Minister Narendra Modi in June, the permanent legal status of many Indians in America has become far more uncertain since Trump’s election.
In the president’s short time in office, his promises and policies — from the “Muslim ban” to a directive that may alter who gets a work visa — have convinced many foreign nationals that they are not welcome. For many of the 2.4 million Indian nationals living in the United States, including roughly 1 million who are scientists and engineers, the fears are existential; although roughly 45 percent are naturalized citizens, hundreds of thousands still depend on impermanent visas that must be periodically renewed.
Changes in the U.S. skilled visa scheme could trigger large economic and intellectual losses, especially in states with many South Asian residents such as California and New Jersey. Some foreign nationals there wonder if Trump’s policies will trigger an Indian brain drain.
Read more at Business Insider.
“Blade Runner 2049” is going to struggle to make it past the $100 million mark at the domestic box office, hardly the response Warner Bros. was looking for given the film’s estimated $300 million production and marketing budget. In a way, the odds were always against “2049” given that its predecessor was also a financial disappointment and only went on to become a cult classic with a very specific demographic of moviegoers. “Blade Runner” is no multi-generational favorite a la “Star Wars” or “Jurassic Park.”
But while the sequel is a box office dud, it’s unquestionably a huge step in the right direction for studio filmmaking.
In a blockbuster age dominated by comic book fare and endless cash-grabbing sequels, it has become increasingly rare to see a big-budget studio film driven not by mind-numbing spectacle or the demands of universe-building but by an auteur’s singular vision. “2049” lacks the epic action set pieces that define Marvel movies, but it has a kind of patience and cerebral edge that no superhero movie would dare touch. The film has a gunfight or two, but it’s largely made up of characters reflecting on their own shifting perceptions of what it means to be human.
Making a blockbuster like “2049” in 2017 is a huge risk, but it’s the kind of risk studios need to keep taking. Director Denis Villeneuve was able to make a pure Villeneuve movie for $300 million, and that alone should be celebrated by cinephiles, regardless of the film’s financial outcome.
Villeneuve did the exact same thing just last year with “Arrival,” another cerebral slice of science-fiction that traded in action scenes for thought-provoking human drama. “Arrival” was made for a fraction of the cost of “2049,” but it was a similar creative risk. Villeneuve made an alien invasion movie and didn’t destroy a single skyscraper; instead, he pieced together the past and future memories of a grieving mother, which isn’t exactly your typical major studio release. “Arrival” ended up grossing over $100 million and earning eight Oscar nominations, including Best Picture and Best Director, but the film is the exception, not the rule. More often than not, allowing a director to see his or her vision through without studio interference will have a polarizing result with fans and at the box office. Just look at what’s happening to “2049” or what happened to Darren Aronofsky’s “mother!” earlier this year for proof.
Fortunately, the post-“2049” future for studio films looks somewhat bright, and it appears we have the science-fiction genre to thank for that. More so than any other genre right now, science-fiction has become the one area where major Hollywood studios seem comfortable taking a risk and giving an auteur the budget needed to try something bold and different. We saw it with “Arrival” and “2049,” and we even saw it with Matt Reeves’ more elegiac and mournful “War for the Planet of the Apes” (which also struggled at the box office over the summer). Usually, audiences had to go indie if they wanted to see challenging sci-fi (“Ex Machina,” “Coherence,” “Primer,” and “Moon” being some examples), but it looks like that’s no longer the case.
Read more at IndieWire.
William of Occam would have hated conspiracy theories. A 14th-century philosopher and Franciscan friar, William is celebrated for developing the “law of parsimony,” better known today as “Occam’s razor.” According to the razor principle, the simplest explanation for an event is almost always the best; shave away any extraneous assumptions, and what you’ve got left is usually the truth.
That’s not exactly the way conspiracy theorists think. Either Barack Obama was actually born in Hawaii, or an international plot unfolded over multiple decades to conceal his Kenyan birthplace and install him in the presidency. Either vaccines are safe and effective, or every major hospital and health organization in the world is covering up the fact that they actually cause autism. Never mind the razor — conspiracy theories are nothing but extraneous assumptions.
The question is, Why do so many people believe in them? Why do even the most preposterous theories — the Nazis survived but they fled to the moon; the world is secretly being run by a reptilian elite — have fiercely loyal adherents? There are nearly as many explanations for conspiracy theories as there are theories themselves, but some patterns do appear again and again.
The most common theories are the ones that follow the eddies of politics. As a broad rule, a party or group that’s out of power will be more inclined to believe in conspiracies than a group that’s in power.
“Conspiracy theories are for losers,” says Joseph Uscinski, associate professor of political science at the University of Miami and co-author of the 2014 book American Conspiracy Theories. Uscinski stresses that he uses the term literally, not pejoratively. “People who have lost an election, money or influence look for something to explain that loss.”
So consistent and predictable is this phenomenon that in the U.S. at least, leading conspiracy theories flip almost the moment the presidency does. When Bill Clinton was President, the principal conspiracy tales involved stories of Clintonian cocaine dealing in Arkansas and the alleged murder of Presidential friend and confidant Vince Foster. Once George W. Bush took over, so too did new conspiracy fables, this time involving Vice President Dick Cheney, the energy company Halliburton and the security firm Blackwater masterminding the Iraq war in order to seize the nation’s oil.
Read more at TIME.
Skis that fold in half, wool that glows, and a smokeless fire.
CAT7 Connect Bat
Most bats approved for college and high-school leagues feature only an inch-wide sweet spot—that ideal location where the ball will rocket off fastest. Marucci machined this one with walls of varying thickness; each section, including the large center of percussion in the middle of the barrel, is calibrated to give the ball just the right trampoline effect. A vibration dampener keeps the impact from stinging your hands. $350.
Read more at Popular Science.
If you believe Jennifer Hyman, CEO of Rent the Runway, her company is a major disruptor in fashion. It rents out designer clothes from some 500 different brands to subscribers who pay a monthly fee, allowing them to borrow a high-end wardrobe for much less than it would cost to actually buy. The company has thrived, topping $100 million in revenue last year and becoming profitable for the first time.
One particular segment of fashion retail should be afraid, Hyman says, and that’s fast fashion. “I plan to put Zara out of business,” she told Glossy after Rent the Runway announced a new, lower-priced subscription plan that will make the service accessible to more customers. The new $89 plan allows subscribers to borrow four items per month. The standard plan, which offers unlimited items per month, is increasing to $159, but now allows customers to borrow four items at a time, rather than the previous three.
The narrative of Rent the Runway as a fast-fashion killer (paywall) is one Hyman has been pushing for some time. She raised the point last year, for instance, when speaking with Quartz about a partnership the company established with Neiman Marcus. As Hyman sees it, people shop fast fashion for trendy items, while they invest more in classic wardrobe staples that they’ll keep for years.
It makes sense, then, that people would prefer to rent fleetingly fashionable items instead of buying them. Consumers are also increasingly open to renting what they want through a service, rather than owning something outright. Think of the way streaming services such as Spotify have changed the music industry and eliminated the need to actually own the music you listen to.
But there’s more than a little hyperbole in Hyman’s comments about putting Zara out of business. For starters, Zara pulled in €15.4 billion (about $18.1 billion) last year, across 93 global markets. Rent the Runway’s revenue, while growing fast, is still a drop in a bucket that size—and the company only operates in the US. According to research firm NPD’s Checkout Tracking data on e-commerce apparel, which analyzes actual spending by consumers who have opted into their panel, only 5% of Zara’s US online buyers also subscribe to Rent the Runway.
Still, Rent the Runway’s success does raise some questions about the future of shopping: Could clothing rental, including Rent the Runway and the growing crop of similar businesses, steal enough customers from fast-fashion brands in the US to make a noticeable dent in their sales?
Read more at Quartz.
Tank, a veteran singer with a decade and a half of R&B hits, remembers the moment when rappers took over the airwaves.
“Bone Thugs-n-Harmony are the Jesuses of melody rap,” he says. “What they did was theirs at the time; nobody could touch it, so nobody did. But when Nelly came around [in 2000] with hip-hop fully infused with melody, that’s when people started to take notice. Then Ja Rule came. It’s like, ‘Hey – you’re in my lane!'”
“We didn’t stop and realize what was happening,” Tank continues. “With hip-hop growing and taking over at the rapid pace that it was, I would say us R&B guys couldn’t compete – and we didn’t compete.”
The ascent of rap on mainstream radio has had wide-reaching consequences for R&B, fundamentally changing the types of voices you hear in the genre’s mainstream. Historically, singers with a mastery of clean, high tones – from Patti Labelle to Deniece Williams to Ralph Tresvant to Usher – flourished next to singers who favored lower, rougher registers, artists like Barry White, Chaka Khan, Anita Baker and Toni Braxton. This variety allowed for a breathtaking range of expression: No other genre celebrated as many fine gradations of the human voice as R&B. But as melodic rappers became ever more dominant, the lower-register R&B singers largely disappeared from the mainstream, and young singers hoping for mainstream success began staying away from deeper tones and rougher textures.
“I’m in my high interview voice so I won’t frighten you,” jokes Braxton, whose low vocals graced multiple platinum-selling records during the 1990s. “I’m prejudiced because I’m a contralto, but I don’t hear many of them anymore.” Kuk Harrell, a vocal producer for superstars like Rihanna and Usher, offers a similar observation. “I really do miss that lower-register voice,” he says. “That’s not to say we don’t have great emotions out of higher-voiced singers, but that particular thing is not here.”
“A certain grit went into something else,” adds the singer Bilal. “Ain’t nobody singing like Teddy Pendergrass no more.”
Why did the doors close for deep and gritty vocalists, who were an important part of R&B’s mainstream as the genre progressed through soul, funk, Quiet Storm, disco, Eighties synth fusions, house music and the hip-hop-inflected mutations of the Nineties? More than 20 conversations with artists, producers, label executives and radio programmers indicate that low-register R&B singers were squeezed on two sides at the turn of the millennium: First, rappers took over the vocal ranges that once belonged to R&B, and then struggling labels abandoned R&B groups, which traditionally supported a wide variety of voices. These shifts were compounded as mainstream radio stopped playing R&B songs, which limited the avenues of exposure for all R&B singers but especially hurt those who favor low, throaty intonations.
Read more at Rolling Stone.
DALLAS (AP) — At some point during many flights, the captain will calmly announce that there could be some bumps ahead and so passengers must be seated with their seat belts on.
The plane might seem to bobble or bounce a bit, but rarely does it turn into a serious threat to safety. That, however, is just what happened to an American Airlines flight last weekend, when 10 people were injured as the plane plowed through turbulence on its way to landing in Philadelphia.
A rundown of statistics, recent incidents, and what pilots and airlines do to avoid hitting potholes in the sky:
About 40 people a year are seriously injured by turbulence in the U.S., according to Federal Aviation Administration figures from the last 10 years. The FAA counted 44 injuries last year, the most since more than 100 were hurt in 2009.
But the official count is almost certainly too low.
The National Transportation Safety Board requires airlines to report incidents that result in serious injury or death, and the FAA uses those reports to tally the number of people hurt by turbulence. But airlines are not required to report injuries unless they require a 48-hour hospital stay or involve certain specific injuries such as major broken bones, burns or organ damage.
Saturday’s American Airlines flight to Philadelphia likely won’t meet those standards — the injured people were released from the hospital within a few hours and didn’t suffer the types of injuries that trigger a report to the federal safety board.
Read more at Popular Mechanics.
Edvard Munch turned his mental struggles into spectacular work. But do artists need to be tortured to achieve greatness?
Look out for the quiet kids who hide in the school library. They’re looking for answers — and if you’re not careful, they might find some. My school had a huge old library full of recesses where an enterprising reader could stay out of sight of the big kids. That’s where I went to work out my mad moods in the best tradition of pretentious teenagers everywhere. Like any fretful pubescent who ever had an anxiety disorder and more black eyeliners than friends, I felt alienated, ashamed — and utterly convinced that nobody in the history of the human race had felt quite the same way. Until I found the books that told me otherwise.
There’s sorcery in that sudden sense of kinship when you discover a piece of art or writing by a stranger from a different time who nonetheless knows exactly how you feel, especially when you’re at the age of accelerating into adulthood with the rickety thrill of a rollercoaster you can’t get off.
Crazy dead poets really got me. Or at least, I got them. They may not have known the indignities of having to wipe the spit off your hair after another morning on the school bus, but they knew what it was like to feel like your body did not belong to you, to be overwhelmed by nameless dread in the middle of a normal day, or to wonder if you were going bonkers.
So I read Sylvia Plath. I worshipped Francis Bacon and Arthur Rimbaud. I kept a postcard version of Edvard Munch’s “The Scream” tucked into my school diary, next to my list of what to do when I thought I might be about to hyperventilate myself to death, including memorize three French verbs and find a novel dark and weird enough to hide inside.
The attraction was obvious, and it was ordinary: those tortured artists made the torture seem, well, rather artistic. I came away with the impression that mental illness was a necessary adjunct to genius. That it made you somehow special. That it was a little bit cool. The really serious writers I loved all seemed to have had bipolar disorder: I found myself wishing that maybe I could have it, too.
Shortly afterwards, I was diagnosed with an entirely different disorder, and ended up in hospital.
I never wished for mental illness again. I wouldn’t wish it on anyone.
It didn’t make me an artist. It didn’t give me special insight. It just made me very sick, and very sad, and came close to making me very dead.
Read more at Anxy.
The future of health care in the U.S. is far from settled, but how people receive it now is also undergoing a revolution. Health records are antiquated; there’s a shortage of primary care physicians; and access to birth control and emergency contraception is limited in some places.
Health and technology companies operating largely outside of the standard health care system are attempting to solve these and other problems with alternative approaches. Whether they have staying power remains to be seen, but here are four compelling methods on the rise.
Video chatting your doctor
It can take more than three weeks to get an appointment with a new doctor, but now, people in all 50 states can visit a physician through their smartphone. “Telemedicine has been touted as the next big thing for several years, and I think it’s finally getting to a stage where adoption is kicking in,” says Hill Ferguson, CEO of Doctor On Demand, an online video chat app. The app provides a platform for more than 1,000 doctors and more than one million users, Ferguson says.
Whether that means obtaining a prescription from your couch or chatting with a therapist for 10 minutes at the office, video chat visits are becoming increasingly common: By 2020 there could be an estimated 45.6 million virtual consultations performed in the U.S., according to data and analysis company IHS Markit. Adoption of telemedicine in health care has increased from about 54% in 2014 to 71% in 2017, a rise of 9% over the past year, according to April 2017 research from HIMSS Analytics, a global health care IT market research group.
Read more at TIME.