This Week In Science & Technology

Grand Potentate

Supporter of Possible Sexual Deviants
We never really talked much about science and tech over there so I wanted to start posting interesting tidbits of what's going on in the world.

I'm forecasting this is going to be the death knell of cable TV:

The FCC has ruled that cable operators are now allowed to encrypt their basic channels, requiring a set-top box even for the local news stations:

1. With this Report and Order (Order), we amend our rules to allow cable operators to encrypt
the basic service tier in all-digital cable systems if they comply with certain consumer-protection
measures. As discussed below, this rule change will benefit consumers who can have their cable service
activated and deactivated from a remote location. By allowing remote activation and deactivation, we
expect our amended rules will result in benefits to both cable operators and consumers by significantly
reducing the number of truck rolls associated with provisioning service and significantly reducing the
need for subscribers to wait for service calls to activate or deactivate cable service. At the same time, we
recognize that this rule change will adversely affect a small number of cable subscribers who currently
view the digital basic service tier without using a set-top box or other equipment. If a cable operator
decides to encrypt the digital basic tier, then these subscribers will need equipment to continue viewing
the channels on this tier. To give those consumers time to resolve the incompatibility between consumer
electronics equipment (such as digital television sets) and newly encrypted cable service, we require
operators of cable systems that choose to encrypt the basic service tier to comply with certain consumer
protection measures for a period of time. In addition, we note that this rule change may impact the ability
of a small number of subscribers that use certain third-party equipment that is not CableCARD
compatible to access channels on the basic service tier. To address this issue, we require the six largest
incumbent cable operators to comply with additional requirements that are intended to ensure
compatibility with certain third-party-provided equipment used to access the basic tier.
From Consumer World:
Thousands of Boston-area RCN cable subscribers awoke today to find that their high definition televisions had become expensive doorstops. Overnight, the company encrypted the signals of all local basic TV channels rendering any televisions without a cable box completely inoperative. RCN is believed to be one of the first cable operators in the country to implement a little-known FCC ruling by encrypting basic channels, scheduled to start today in Massachusetts, New York, Pennsylvania, and Washington, DC.
Last October, the FCC ruled that any all-digital cable companies could scramble the signals of local TV channels which, up until that time, had been required to be transmitted "in the clear." That meant that basic channels could be picked up without the need for a set-top box by any subscriber with a high definition television equipped with a built-in QAM tuner.
"I would imagine that thousands of RCN customers were shocked when they turned on their televisions today and didn't have a picture because they didn't prepare for the change," commented Consumer World founder Edgar Dworsky.
In March, as required by FCC rules, RCN sent subscribers a 30-day advance notice indicating that as of April 10, all televisions connected to their service would be required to have a set-top box. To ease the transition, the FCC ordered that cable companies provide customers with one or two "devices" free to decrypt the new signals for a period of up to two years, depending on the customer's level of service.
Nothing in the FCC rule required that these free boxes transmit the high definition signal that customers were enjoying up until today without a box. Accordingly, RCN's website states that subscribers can only receive a standard definition cable box free. After the free period, subscribers will be forced to rent a box (currently about $10 a month for an HD box), buy a box if any come to market (but it will require the monthly rental of a CableCARD), or go back to using rabbit ears. Some RCN customers, however, have been able to negotiate a free HD box for one year.
The new rule is expected to primarily affect secondary televisions in many households -- the ones in the den, kitchen, bedroom, or guest room -- which are less likely to have a cable box or DVR already attached to them. According to Nielsen, 65.9 million households have three or more TVs. Dworsky himself has three HDTVs and a TV tuner in his PC that will each require a new cable box.
Although Comcast has not yet encrypted its basic channels, the company is expected to do so as soon as its digital transition is complete, since it was one of the companies that had lobbied the FCC to allow scrambling of basic channels.
"The change in the rules is going to trigger a significant added expense for cable customers with multiple TVs," said Dworsky. "Who would think that in 2013 we would have to resort to going back to using rabbit ears just to view local television channels. Thanks, FCC."

You're now going to need an antenna to get your local signals without a cable box.
NOAA thinks the Arctic will be ice-free by 2050:

Previous studies have projected that the Arctic could see its first ice-free summer sometime between 2015 and 2100. The new meta-study (a study of 36 previous computer models used by the UN and others to predict Arctic climate change) published this week in the journal Geophysical Research Letters, indicates that the answer is likely right in between those extremes. The study attempted to narrow the discrepancy between previous ones by looking at more recently observed data. "Eighty percent of the [previous] models are too slow to start with, and probably more than that," Overland told The Verge.
I worked in broadcasting for 27 years. This won't be the death knell of cable. Satellite is already doing this.

The real reason never gets talked about: we are out of spectrum. With the growing demand for mobile wifi and wireless, the Feds have been forcing HD in radio and TV to free up the existing FM band (TV is at the bottom of FM). They'll be selling that spectrum for new use. Cable and DISH get broadcasters off the natural broadcast spectrum.

Yes, in a technical sense I agree with you 100%. I should have clarified - I meant more from the consumer perspective: people cutting cable altogether, or even more switching over to satellite. People are already pissed off with their shitty cable service. Now being forced to buy boxes just to watch the news? Torrents, antennas, and Aereo are going to get a lot more popular.
Fellas, if you haven't gotten the end of your pecker lopped off, now's the time:

Multiple studies have shown that circumcision decreases a man's risk of contracting HIV by as much as 60 percent, but there's been no concrete explanation as to why that is. New research, published yesterday in the journal mBio, supports the theory that this is due to decreased levels of bacteria. The paper shows that circumcision dramatically lowers the amount of bacteria in the area, which could be the reason for the increased resistance when compared to uncircumcised males.
The research team responsible for the study at the Translational Genomics Research Institute in Arizona looked at the levels of bacteria in 156 uncircumcised men aged 15 to 49. Half of the volunteers were then circumcised, while the other half were not. After one year, there was an average 81-percent reduction in bacteria in the circumcised men. The team concludes that removing the foreskin positively changes the penile ecosystem by increasing the amount of oxygen that reaches the previously-covered skin. This in turn greatly decreases the prevalence of anaerobic bacteria (although slightly increases aerobic bacteria levels), reducing the risk of inflammation that can make HIV infection more likely. Although the team didn't directly look at HIV risks, the research suggests that the lower level of anaerobic bacteria "may play a role in HIV risk reduction."
The fact that circumcised people are less likely to contract HIV represents one of very few cases where modifying the body's natural form can have positive effects on health. Speaking with The Los Angeles Times, Lance Price, a co-author of the paper, explained that "as a society, we've gotten used to thinking about alterations to the microbiome as having negative outcomes," giving the example of antibiotics resulting in a gut infection. "But here's a situation where we're flipping that notion on its head. The disturbance of the microbiome could have a positive effect."

Antibiotic resistance is rapidly becoming a critical public health risk, with a growing number of bacteria developing resistance to a wide swath of drugs. Despite the threat posed by these so-called "superbugs," however, a new report warns that the number of antibiotics being created to thwart them is far too low.
The report, authored by members of the Infectious Diseases Society of America (IDSA), describes the pipeline of new antibiotics as "on life support." Most notably, it notes that only seven new drugs are currently in advanced stages of testing as potential treatments for a particularly nasty class of bacteria: multidrug-resistant gram-negative superbugs. These bacteria include E.coli, Salmonella, and CRE, the latter of which was recently described as a "nightmare bacteria" by CDC director Thomas Frieden, M.D., because it can resist even the strongest antibiotics.
"We're on the precipice of returning to the dark days."
"We're on the precipice of returning to the dark days before antibiotics enabled safer surgery, chemotherapy, and the care of premature infants," said Helen Boucher, M.D., lead author of the report. "We're all at risk."
Pharmaceutical companies aren't investing enough money into the development of new antibiotics, the report noted. That's largely because such drugs, which are taken for a few days or weeks, aren't as profitable as those designed to treat chronic conditions over periods of years. The IDSA wants to see financial incentives that'll coax drug companies back into antibiotic development: only four major pharmaceutical companies still invest in antibiotic research and development, and one of those, AstraZeneca, recently announced plans to reduce its involvement.
Never thought I'd see this out of John McCain, but maybe grandpa is fed up with not being able to watch his sports:

Senator John McCain today introduced the Television Consumer Freedom Act of 2013, legislation that would encourage cable operators and entertainment conglomerates to unbundle channels and offer programming "a la carte." Rather than mandating his desired end result, McCain notes that his bill is completely voluntary, offering incentives that would ideally result in consumers being able to purchase their preferred channels individually. Cable providers (and content companies) have long resisted such ideas.
"This is unfair and wrong."
"Today, we’re putting up a stop sign," McCain remarked during the introduction of his legislation. "My legislation would eliminate regulatory barriers to a la carte by freeing up multichannel video programming distributors (MVPDs) – like cable, satellite and others offering video services – to offer any video programming service on an a la carte basis." McCain says this would be achieved by linking availability of the compulsory copyright license, which lets broadcasters retransmit programming without obtaining direct permission from copyright holders, with the voluntary offering of a la carte subscription models. "In other words, if the MVPD does not offer a broadcast station — and any other channels owned by the broadcaster — on an a la carte basis, the MVPD cannot rely on the compulsory license to carry those broadcast stations," McCain said.
McCain also seeks to address the issue at the core of a lawsuit between Cablevision and Viacom. "Furthermore, because not all programmers also own broadcast stations, the bill contains a provision that would create a ‘wholesale’ a la carte market by allowing programmers to bundle their services in a package only if they also offer those services for MVPDs to purchase on an individual channel basis," he said. "Thus, if a cable operator doesn’t want to carry channels like MTV, it would have the option of not doing so and only buying, and carrying, the channels it thinks its consumers want to watch." In its complaint against Viacom, Cablevision maintains that Viacom has forced the operator to continue paying for unpopular channels in order to keep programming its customers actually want.
The senator's bill does have some teeth, however. McCain is particularly annoyed with broadcasters who continually lessen the breadth of over-the-air (OTA) programming available to consumers, attempting to push viewers to more profitable options. Networks that engage in such behavior would be stripped of their spectrum, with those resources auctioned off by the FCC.
And finally, the Television Consumer Freedom Act of 2013 would essentially wipe out the "blackout rule" that prevents live events from being seen under certain circumstances. Any venue partially paid for with taxpayer money (which would include the vast majority of professional stadiums) would be required to repeal blackout restrictions. “In the end, the Television Consumer Freedom Act is about giving the consumer more choices when watching television. It’s time for us to help shift the landscape to benefit television consumers," McCain said. McCain's proposed legislation is certain to face fierce resistance from the industry.
Isn't cable tv pretty much on the way out? Why wouldn't we be able to directly subscribe to a channel, or even better just directly download and watch the shows we want, over the internet? See for example Netflix original programming.

Cable tv might have a few more years left in it but not much.

I have already made this switch. My TV is plugged into nothing except my broadband router. It's better service than I got from my old cable provider, and cheaper.
Isn't cable tv pretty much on the way out? Why wouldn't we be able to directly subscribe to a channel, or even better just directly download and watch the shows we want, over the internet? See for example Netflix original programming.

Cable tv might have a few more years left in it but not much.

Careful what you wish for. It won't be fun to subscribe to 1,000 channels and have them all charge their own fee and have different rules, etc. I heard a roundtable about this recently, and while I went in thinking "fuck ya, screw the man," you'd actually be creating 1,000 mans and it wouldn't be pleasant.
Probably works for me because I basically only ever use the BBC iPlayer and netflix. I have the equivalent of the iPlayer from the three other terrestrial channels (which are free), but rarely use them. Don't watch a lot of TV anyway.

A team of researchers said Wednesday that it had produced embryonic stem cells — a possible source of disease-fighting spare parts — from a cloned human embryo.
Scientists at the Oregon Health and Science University accomplished in humans what has been done over the past 15 years in sheep, mice, cattle and several other species. The achievement is likely to, at least temporarily, reawaken worries about “reproductive cloning” — the production of one-parent duplicate humans.
But few experts think that production of stem cells through cloning is likely to be medically useful soon, or possibly ever.
“An outstanding issue of whether it would work in humans has been resolved,” said Rudolf Jaenisch, a biologist at MIT’s Whitehead Institute in Cambridge, Mass., who added that he thinks the feat “has no clinical relevance.”
“I think part of the significance is technical and part of the significance is historical,” said John Gearhart, head of the Institute for Regenerative Medicine at the University of Pennsylvania. “Many labs attempted it, and no one had ever been able to achieve it.”
A far less controversial way to get stem cells is now available. It involves reprogramming mature cells (often ones taken from the skin) so that they return to what amounts to a second childhood from which they can grow into a new and different adulthood. Learning how to make and manipulate those “induced pluripotent stem” (IPS) cells is one of biology’s hottest fields.
Stem cells have the capability of maturing into different types of tissue depending on how they are stimulated. Embryonic stem cells (ESCs), plucked from a microscopic embryo, have the greatest potential. With the right molecular nudges, they could theoretically be used to grow new kidneys, lungs and hearts for use by people whose own organs have worn out.
Some experts think that “regenerative medicine” will eventually become an approach to healing that is as important as surgery or pharmacology.
The Oregon researchers, led by Shoukhrat Mitalipov, produced embryonic stem cells through “somatic cell nuclear transfer,” the technique used in 1996 to make Dolly the sheep the first cloned mammal.
The nucleus of a mature cell is transplanted into a human oocyte (egg) whose own nucleus has been removed. After the right stimulation, this new hybrid cell starts to divide and develop just as a sperm-fertilized egg would. When it is at the “blastocyst” stage — about 100 cells — its core contains a small number of embryonic stem cells capable of becoming any type of cell possessed by the human body.
But getting the doctored egg to grow even that far is extremely difficult. For some species, hundreds of eggs must be subjected to nuclear transfer before any produce viable embryonic stem cells. The failure of human oocytes to produce them had led some scientists to speculate that the technique simply might not work in people for some reason.
Mitalipov and several members of his team work at the Oregon National Primate Research Center and had refined their techniques using rhesus monkeys. They used nuclei from the skin cells of newborns or, in some cases, fetuses. Their stimulants included a pulse of electricity at the time of nuclear transfer and the addition of caffeine to the fluid cells lived in.
The tweaks and improvements apparently made all the difference. In one experiment, eight oocytes harvested from one woman produced five blastocysts and four embryonic stem cell lines — a success rate virtually unseen in other animals. The researchers subsequently proved the cells were "pluripotent" by coaxing them to become, among other things, beating heart muscle cells.
The experiments were reported in a paper published online in the journal Cell.
“Where the kudos come is in being able to over time enhance and improve the technology developed in other species to make this amenable to the human oocyte,” Gearhart said.
The blastocysts could be implanted in a woman’s uterus and might develop into a fetus. Most cloned animals, however, turn out to have major health problems and shortened lives.
“We just need to make sure it’s clear to the public that no one in their right mind would want to do that. There is no intent to do reproductive cloning. None at all,” Gearhart said.
Are these embryonic stem cells more versatile than IPS cells made by reprogramming skin cells?
“That’s of interest,” Jaenisch said. But whatever the answer, “the consequence would be to make the IPS cells better.” Given the difficulty of obtaining human oocytes, and the controversial nature of the research, embryonic stem cells aren’t likely to ever be the preferred tool of regenerative medicine, he said.
The Kepler Telescope was turned off yesterday. :challengefailed-71:

Today was a sad day in astronomy, as it was announced that the prolific planet-hunting telescope that has single-handedly changed our view of the galaxy and universe has abruptly suffered the loss of a second spinning reaction wheel and has been powered down. The mission is suspended indefinitely, and it could possibly mean the complete loss of the program while in the heart of its mission. The spacecraft launched in 2009 and has been changing our views on the universe almost daily. Kepler, like all spacecraft, relies on an array of spinning reaction wheel assemblies to stabilize it and maintain its constant field of view. Kepler was fitted with four of these units and needs only three to remain operational. It unfortunately lost one of its four reaction wheels in July 2012, leaving only the necessary three remaining. Now, with the loss of a second reaction wheel, the spacecraft has been rendered unusable. Upon realizing that the spacecraft had, as designed, placed itself in safe mode, controllers attempted to spin up the reaction wheels manually but had no success. As of now, Kepler and the hopes of finding and confirming exoplanets have been placed in a "parking mode" in order to save mission fuel while solutions are sought. Hopefully we get some good news, though as of now it seems unlikely.
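For anyone wondering why losing the second wheel kills the mission: a spacecraft needs torque authority about all three rotational axes, which means the spin axes of its working wheels have to span 3D space. Here's a minimal sketch of that idea (the wheel axis directions are made up for illustration, not Kepler's actual geometry):

```python
import numpy as np

# Hypothetical wheel spin-axis unit vectors (NOT Kepler's real layout).
# Four wheels give one spare; any three independent axes span 3D space.
wheel_axes = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [0.577, 0.577, 0.577],  # skewed spare wheel
])

def controllable_axes(axes):
    """Number of independent torque directions the wheel set can produce."""
    return np.linalg.matrix_rank(axes.T)

print(controllable_axes(wheel_axes))      # 3 -> full three-axis pointing
print(controllable_axes(wheel_axes[:2]))  # 2 -> one axis uncontrollable
```

With only two wheels left, every torque the spacecraft can generate lies in a single plane, so it can no longer hold the ultra-precise pointing that planet-transit photometry requires.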
^ that's a very sad and massive blow to science.

Wait until we have real quantum computers. Millions of times faster than anything we have today.

Yup, I was registered on a site where we classify stars by their period (don't know how to explain it well), and then if a star showed a trend it was reviewed by the experts to see if it was an actual planet. That page contributed to the discovery of, last time I checked, 46 exoplanets.

Regarding the quantum computers: there is a video from IBM in which they moved single atoms of carbon and made a movie. It's awesome. That kind of atomic-scale control points the way toward quantum computing. It is going to be an interesting race... the first country to develop a quantum computer would be able to hack into the most private information of other countries. The most advanced security systems of today will be useless against a quantum computer.
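To make the security point concrete: RSA-style encryption is only safe because factoring a large number is brutally slow on classical hardware, and Shor's algorithm on a working quantum computer would factor in polynomial time. A toy sketch of the classical side (tiny made-up primes for illustration; real keys use numbers hundreds of digits long):

```python
# Toy classical factoring by trial division. This takes ~sqrt(n) steps,
# which is astronomical for a 2048-bit RSA modulus. Shor's algorithm on
# a quantum computer would crack the same problem in polynomial time.
def factor(n):
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

# Hypothetical tiny "public modulus" built from two primes.
p, q = 104723, 104729
print(factor(p * q))  # (104723, 104729)
```

Recovering those two primes from the product is exactly what breaks the private key, which is why a practical quantum computer would obsolete today's public-key crypto overnight.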
Definitely, in addition, just look at the influence these computers could have on international trading markets.
Definitely, in addition, just look at the influence these computers could have on international trading markets.

I wonder how much the first quantum computer would cost. :challengeconsidered-44:
Very cool image of how Kepler looks for new planets, and of how small Jupiter is passing in front of the Sun in comparison to other stars. Beautiful.


In a major step forward for tissue engineering, surgeons at Duke University have successfully implanted a bioengineered blood vessel into the arm of a patient with end-stage kidney disease. The procedure is the first of its kind in the US, and one of the first such efforts worldwide.
The vessels could replace synthetic grafts
The engineered vein was created with donated human blood vessel cells, which are implanted onto a tubular, biodegradable scaffold. The scaffold supports those cells as they grow into a fully-formed vessel. Once the process is complete, the new vein is "scrubbed" of cellular properties that might trigger an immune response — and subsequent rejection — in a patient. Where kidney disease is concerned, the vessels could replace synthetic grafts used to link an artery to a vein for the process of hemodialysis. Such synthetics are accompanied by serious risks, including clotting and infection.
The technique might one day yield blood vessels for other procedures
"We hope this sets the groundwork for how these things can be grown, how they can incorporate into the host, and how they can avoid being rejected immunologically," said Jeffrey Lawson, MD, PhD, a vascular surgeon who helped develop the veins. Indeed, the technique might one day yield blood vessels for other procedures, namely heart bypass surgery.
For now, the Duke team is focused on conducting several more surgeries on patients suffering from kidney disease: this operation was only the first in an FDA-approved clinical trial that'll evaluate the safety of the veins on a total of 20 patients. And because the veins are engineered to be universal — rather than personalized for each patient — they might one day be mass produced for on-demand availability.
Leprosy, the disease that causes skin lesions and eventually permanent disfigurement, was a constant threat in Medieval Europe, with as many as one in 30 people infected in some areas. But something remarkable happened around the mid 1500s — disease rates dropped sharply, according to historical records, although scientists were not quite sure why. Now an international team of researchers has uncovered DNA evidence that suggests humans rapidly evolved to fight off the disease.
The scientists came up with the theory after examining ancient human remains, scraping the bones for dead Medieval leprosy bacteria. They then managed to take the bacteria and reconstruct their entire genome, or genetic map, despite the fact that there was very little genetic material left to work with (less than 0.1 percent). What they discovered was that the Medieval leprosy bacteria were similar to modern strains of the disease, which are not nearly as contagious among humans. That led the scientists to conclude that the human population adapted natural defenses against the disease relatively quickly. The results were published today in the journal Science, and may lead to better disease tracking and fighting tools. Leprosy still affects more than 200,000 people annually, scientists noted.
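The genome reconstruction works, roughly, by stitching together huge numbers of short overlapping DNA fragments (in practice the team mapped damaged ancient reads against a modern reference genome; the greedy overlap merge below is only a toy illustration of the stitching idea, with made-up sequences):

```python
# Toy illustration of rebuilding a sequence from overlapping fragments.
# Real ancient-DNA assembly aligns millions of short, damaged reads to a
# reference genome; these three fragments are invented for the example.
def merge(a, b, min_overlap=3):
    """Append b to a by the longest suffix/prefix overlap, if any."""
    for k in range(min(len(a), len(b)), min_overlap - 1, -1):
        if a.endswith(b[:k]):
            return a + b[k:]
    return None  # no usable overlap

fragments = ["GATTACA", "TACAGGT", "AGGTCCA"]
seq = fragments[0]
for frag in fragments[1:]:
    seq = merge(seq, frag)
print(seq)  # GATTACAGGTCCA
```

The hard part with a 0.1-percent sample is that most fragments are missing or chemically damaged, which is why pulling a complete genome out of Medieval bone is such a feat.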
The Supreme Court of the United States just handed down a landmark ruling today when it comes to the practice of patenting genes from the DNA of living organisms. In a unanimous 9-0 decision, the court ruled that "a naturally occurring DNA segment is a product of nature and not patent eligible merely because it has been isolated," invalidating biotech company Myriad's claim to exclusive rights on two breast-cancer causing genes in humans, which the company argued it should be able to patent because it was the first to isolate them and identify their function.
However, the Supreme Court did uphold the legal right for companies and individuals to file patents on synthetic genes, those strands of DNA and genetic material that are modified in a laboratory. That gives the company Myriad a partial victory, upholding its patent on a modified version of the breast-cancer genes it identified and isolated. "We hold that a naturally occurring DNA segment is a product of nature and not patent eligible merely because it has been isolated, but that cDNA is patent eligible because it is not naturally occurring," the court said in its ruling.
"a naturally occurring DNA segment is a product of nature and not patent eligible."
Interestingly, the court also addressed the possibility that natural mutations could lead to a DNA sequence that was very similar to a synthetic one created by a company or scientists, but said that this was not good enough to render patents on such synthetic genes invalid. "The possibility that an unusual and rare phenomenon might randomly create a molecule similar to one created synthetically through human ingenuity does not render a composition of matter nonpatentable," the court said. A lawyer familiar with the case, but who was not involved in litigation, told The Verge that the ruling was narrow as expected, but had the potential to have large ramifications on the biotech industry and medicine.

Natural DNA Cannot Be Patented, Supreme Court Rules
In a decision that could have broad-reaching effects on the future of science and medicine, the Supreme Court ruled that:
— "A naturally occurring DNA segment is a product of nature and not patent eligible merely because it has been isolated."
— But, synthetically created "strands of nucleotides known as complementary DNA (cDNA)" are "patent eligible" because they do not occur naturally.
The case revolved around Myriad Genetics, a Utah biotechnology company that:
"Discovered and isolated two genes — BRCA 1 and BRCA 2 — that are highly associated with hereditary breast and ovarian cancer. Myriad patented its discovery, giving it a 20-year monopoly over use of the genes for research, diagnostics and treatment. A group of researchers, medical groups and patients sued, challenging the patent as invalid."​
The court's unanimous decision Thursday, Reuters writes, was "a mixed ruling. ... The nine justices reached a compromise by saying synthetically produced genetic material can be patented but that genes extracted from the human body, known as isolated DNA, do not merit the same legal protections."
Writing for the court, Justice Clarence Thomas says that "we merely hold that genes and the information they encode are not patent eligible ... because they have been isolated from the surrounding genetic material."
The US Food and Drug Administration has issued a warning to the healthcare industry, calling for more vigilance when it comes to protecting medical devices from hacking. Anything from pacemakers to hospital x-ray machines is at risk, thanks to a wide array of lax cybersecurity practices like "hard-coded passwords," out-of-date software, and poorly-protected internet connections. The FDA is calling on medical device manufacturers to "take steps to assure that appropriate safeguards are in place," and is recommending that device makers submit security plans as a part of their FDA approval requests.
"It's safe to say most medical device manufacturers are affected."
Although the FDA says that it's not aware of any injuries or deaths as a result of hacking, a senior official told the Wall Street Journal that "We are aware of hundreds of medical devices that have been infected by malware." Ars Technica points to another report calling attention to the issue of hard-coded passwords, issued by a group that acts as a liaison between private industry and government on cybersecurity matters. "It's safe to say most medical device manufacturers are affected," one researcher told Ars. However, right now most hacking of medical devices occurs in a more traditional way — many hospital systems use computers that can be infected with traditional viruses like Conficker.
The FDA's recommendations for both medical device manufacturers and hospitals aren't very far off from the kinds of things that consumers have heard for years: changing passwords, monitoring network usage for anything that looks questionable, and keeping software up-to-date. On that last point, the FDA notes that it "typically does not need to review or approve medical device software changes made solely to strengthen cybersecurity."
Unfortunately, the industry may not be inclined to quickly and forcefully address cybersecurity. Big manufacturers have been reticent to acknowledge issues, let alone fix them. "There is a real fear that, along with acknowledgment, comes increased development costs and regulatory oversight," one executive told the WSJ.
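The "hard-coded passwords" problem the FDA flags is depressingly easy to spot once you look. Here's a toy sketch of the kind of scan an auditor might run over extracted firmware strings or config text (the pattern and the sample config are invented for illustration; real audits combine tools like `strings` and binwalk with manual review):

```python
import re

# Naive pattern for credential-looking key/value assignments
# (illustrative only; real scanners use much richer rule sets).
CRED_PATTERN = re.compile(r'\w*(?:password|passwd|pwd)\w*\s*[:=]\s*\S+',
                          re.IGNORECASE)

def find_hardcoded_credentials(text):
    """Return credential-looking assignments found in config text."""
    return [m.group(0) for m in CRED_PATTERN.finditer(text)]

sample_config = "device_id=1234\nadmin_password=changeme\ntimeout=30"
print(find_hardcoded_credentials(sample_config))
# ['admin_password=changeme']
```

The fix the FDA is nudging vendors toward is equally unglamorous: per-device credentials, forced first-boot password changes, and software update paths that don't require a new approval cycle.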
Gatekeepers of Cable TV Try to Stop Intel

June 12, 2013
WASHINGTON — As Intel tries something audacious — the creation of a virtual cable service that would sell a bundle of television channels to subscribers over the Internet — it is running up against a multibillion-dollar barricade.
That barricade is guarded by Time Warner Cable and other cable and satellite distributors, which are trying to make it difficult — if not impossible — for Intel to go through with its plan. The distributors are using a variety of methods to pressure the owners of cable channels, with whom they have lucrative long-term contracts, not to sign contracts with upstarts like Intel, thereby preserving the status quo.
Intel, however, is undeterred, and its executives intend to launch the service by the end of the year. They are ready and willing to pay more than existing distributors do for channels. But to date the company has not announced any deals with channel owners.
To Intel, and to some analysts, the behavior by the existing distributors — in some cases giving financial incentives to friendly channel owners, in other cases including punitive measures in contracts — has an anticompetitive whiff. The antitrust division of the Justice Department is looking into the issue as part of a broad investigation into cable and satellite company practices, according to people contacted by the department, who spoke on condition of anonymity because they were not authorized to speak publicly. A department spokeswoman declined to comment.
Public attention to the issue, which gained new life this week during the cable industry's annual conference here, might also spur the Federal Communications Commission to afford would-be Internet distributors like Intel the same legal protections that existing distributors already enjoy. The commission has been considering such a change for more than a year.
“The government has to step up and protect these companies, or the incumbents are going to kill them in their cradles,” said Gigi B. Sohn, the president of the public interest group Public Knowledge.
Prospective products like Intel TV, delivered through the broadband Internet infrastructure of Comcast, Time Warner Cable or another provider and sometimes called “over the top TV,” have the potential to radically alter the media marketplace in the United States.
Unlike Netflix, which sells a library of TV episodes and mainly supplements cable, a service like Intel’s — with dozens of channels, big and small, streaming through a modern interface — could cause more consumers to cancel their cable subscriptions. (They would have to keep a broadband subscription, however, unless or until wireless capacity improves.)
It could also stir further innovation within the industry. If Intel’s service ever goes on sale, industry executives predict that others will quickly follow — either because they want to, or they feel they have no choice.
Apple, Microsoft and Sony are often mentioned as possibilities, but the more immediate competition might come from Comcast, Time Warner Cable and other major distributors, which could suddenly compete directly in markets all across the country. Comcast has quietly been working on an “over the top” service for well over a year.
“Suddenly there’d be a whole new world of competition,” said one of the executives, who declined to express support for the “over the top” option for fear of angering the existing distributors.
Most of those companies declined to comment on the record, but some representatives said privately that they are taking common-sense steps to protect their businesses. Each confidential contract between a distributor and a channel owner is different, they said.
Some contracts include clauses that expressly prohibit channels from being sold to an Internet distributor like Intel, while other contracts merely discourage such competition through financial incentives or penalties. So-called most favored nation clauses, which are common, ensure that if another distributor later receives a cheaper rate for a channel, that rate applies across the board. Some of these provisions have been in place for years.
But critics said that the contractual language makes it much harder for new companies to enter the marketplace. A Justice Department official said in a presentation last year that “contracts that reference rivals” have the potential to harm competition.
Within the cable industry, the practice of discouraging new Internet distributors has been suspected but not widely documented. The issue attracted new attention on Tuesday during the cable industry’s conference when Richard Greenfield, an analyst at BTIG Research, wrote in a blog post that at least one unnamed distributor had prevented a channel owner from selling to a service like Intel. Whether illegal or not, “it most certainly is bad for consumers, as it limits competition and prevents the emergence of distributors who can provide revolutionary new ways of experiencing” TV, he wrote.
Mr. Greenfield did not name any names, but several channel owners and smaller distributors said Time Warner Cable, the nation’s second-largest cable company after Comcast, had been by far the most aggressive in its dealings with channels. When Comcast acquired NBCUniversal in 2011, it signed a consent decree with the government that prohibited it from trying to block budding Internet distributors. Time Warner Cable declined to elaborate on its practices on Wednesday, but said in a statement that “it is absurd to suggest that, in today’s highly competitive video marketplace, obtaining some level of exclusivity is anticompetitive. Exclusivities and windows are extremely common in the entertainment industry; that’s exactly how entertainment companies compete.” It cited the N.F.L. deal with DirecTV and the Netflix distribution of the former cable show “Arrested Development,” among other examples.
Mr. Greenfield rejected that explanation. “They are not paying for exclusivity,” he said. “They are saying you can sell to X, to Y and Z, but you are forbidden from selling to this new class, called A.”
A spokesman for Intel declined to comment. But this week the company had a suite at a hotel, one block from the cable conference site, and held demonstrations of its service for potential partners. What Intel needs, according to people briefed on their plans, is the support of a critical mass of channels — not the entire universe that Comcast or DirecTV has, but enough to have a viable service. Intel will not introduce the service without that.
Semen solution: can a sperm bank save bees from oncoming extinction?
Bee populations are on a dangerous decline, but a plan to stockpile reproductive fluids might offer the solution
By Katie Drummond on June 14, 2013 09:00

Honey bee colonies around the world are dwindling, and the consequences threaten to be severe: We rely on these bees to pollinate myriad agricultural crops, including one-third of crop species in the US alone, making their disappearance a genuine threat to the human food supply. Now, one team of researchers has come up with a unique technique to prevent what's known as Colony Collapse Disorder (CCD): a honey bee sperm bank.
A group at Washington State University is traveling the world to collect sperm samples from various honey bee subspecies. To do it, they simply squeeze a male honey bee's abdomen, which releases a small quantity of sperm that's collected in a syringe. From there, the researchers plan to cryogenically freeze that sperm, in an effort to develop a robust, diverse library of genetic material from which to conduct cross-breeding.
"Having semen in liquid nitrogen won't solve every problem. But it can help."
The end goal? Something of an ultimate super bee, capable of withstanding many of the environmental and human factors thought to be behind increasing rates of CCD. "Having semen in liquid nitrogen won't solve every problem," Steve Sheppard, an entomologist at WSU and the project's leader, told The Verge. "But it can certainly help."
The problem that Sheppard and his team are trying to solve is, indeed, a daunting one. Earlier this year, a consortium of researchers led by the USDA reported that 31 percent of commercial US bee colonies had been killed off or disappeared the previous winter. Honey bee populations have been declining for decades, but their decimation has been occurring at increased rates since 2006, when some beekeepers reported losing anywhere from 30 to 90 percent of their colonies. Unfortunately, experts have yet to determine exactly how to curb these losses — which is where the sperm comes in.

31% of commercial US bee colonies died or disappeared in 2012
The WSU program could, one day, yield honey bees with significantly different traits from those currently inhabiting American colonies. And that's a big deal: Since 1922, the import of foreign honey bee subspecies — of which an estimated 28 exist worldwide — has been tightly restricted due to concerns about the spread of a bee-killing mite that once destroyed English bee populations. As a result, the genetic pool among American honey bee colonies is extremely small, making it tough for breeders to select for bees whose traits make them more resilient. "Selective breeding is, of course, done with other animals, but it hasn't been done effectively with bees," Sheppard said. "Because logistically, there just wasn't a good option."
In 2008, Sheppard and his colleagues got USDA clearance to start importing the sperm of three foreign subspecies: Italian honey bees that reproduce rapidly, as well as bees from the eastern Alps and Georgian mountains that can withstand cold temperatures and tend to reproduce later in the year — mitigating the risk that cold weather in early spring might kill off an entire brood. The Italian subspecies would suit the needs of beekeepers in the southern US, who need bees that can rapidly pollinate in the early spring, while mountain bees would address the requirements of beekeepers in cooler US climates.
"Being able to stockpile semen is a huge step forward."
But those imports weren't particularly helpful until now: Unless effectively preserved, honey bee sperm is only viable for two weeks, meaning that the team was forced "to have virgin queens ready to be inseminated" as soon as the imported sperm arrived. "Being able to stockpile semen is a huge step forward," Susan Colby, a research associate at WSU, said. "We can collect samples that might not be useful now, but might carry a trait that will be important 20 years into the future."
In addition to bees with diverse climatic tolerance, the team is hoping to cross-breed for resilience. Tougher bees might be able to survive the potentially deadly impact of certain pesticides, which are thought to interfere with honey bees' internal radar and thereby prevent them from collecting pollen and locating their hive. Certain genetic traits can also be targeted to protect colonies of the future from parasitic mites, viruses, and bacterial ailments, all cited by the USDA as possible contributors to CCD among US bees. "Instead of relying on miticides or antibiotics to prevent an illness, for example, we can breed bees who resist that illness altogether," Sheppard said.
The team is hoping to cross-breed for resilience
The sperm bank is still in its early stages, with only a handful of preserved samples ready for examination and — if a sample's genetic material is up to snuff — injection into the oviduct of some lucky queen bee. But Sheppard and Colby, who will next week travel to Italy to collect more sperm samples from several colonies, anticipate rapid growth in the years to come. "I don't know how much we'll need," Colby said. "But our goal is to collect as much semen as we can."
Real-Life True Blood: Synthetic Blood Is Coming — And So Are a Host of Potential Complications

  • by Devon Maloney
  • June 14, 2013
Season 6 of HBO’s vampire drama True Blood premieres on Sunday night, presumably following up on last year’s cliffhanger, in which the factory that produces Tru-Blood — the bottled synthetic blood that allows vampires to go “vegetarian” — was burned to the ground, destroying the product that made it possible for vampires to coexist non-violently with people.
But out here in the real world, the future of synthetic blood is just beginning. After decades of global research, controversies, and failed approval petitions, the UK’s Medicines and Healthcare products Regulatory Agency finally gave researchers at the Scottish Centre for Regenerative Medicine the go-ahead late last month to start developing synthetic blood with adult stem cells.
The license allows the researchers to use already-recognized stem cell technology to create a compound that would both eliminate the risk of infusion-transmitted infections and supplement (if not eventually take the place of) chronically limited blood banks worldwide. After years of partial synthetic successes at best, it will permit the first-ever human clinical trials of synthetic blood. Oh, also? The license permits blood manufacturing “on an industrial scale.” Cue the True Blood overture (albeit sans vampires).
And according to Ruha Benjamin, a sociologist at Boston University, the arrival of synthetic blood is also likely to come with some serious socioeconomic and ethical issues, including the kinds that have complicated many medical advances before it.
Benjamin is the author of People’s Science: Bodies and Rights on the Stem Cell Frontier, a new book that explores the social forces that inform and arise from scientific research, especially controversial medical practices like stem cell trials. Though her research focuses specifically on paid clinical egg donors in California and New York, the patterns of structural inequality she outlines are in danger of repeating themselves in Scotland – and later, in the rest of the world. The two major quagmires, she told Wired, lie in how clinical trials for synthetic blood are conducted and in the potential patenting of the technology.
Testing Tru Blood

According to its statements in The Scotsman, the Scottish Centre for Regenerative Medicine will produce synthetic blood for the trials using induced pluripotent stem cells – adult cells that can be coaxed into acting like embryonic stem cells. That means they’ll need stem cell donors as well as, later on, transfusion recipients, and neither of those comes free.
“Most clinical trials offer some compensation,” said Benjamin. “They don’t call it payment; they consider it a stipend, ‘to offset the burden of participating.’ That means that, for the most part, people who are well off are not participating. People signing up on websites for clinical trials are often working-class men [and] men of color.”
This sort of compensation is substantial enough, she says, that it has resulted in professional guinea pigs: transient “workers” who make a meager living ($15,000 to $20,000, roughly) by participating in clinical trials. They know where trials are held, who is conducting them, and where communal housing is available to them across the country, and they rack up “paychecks” as they go. A decent percentage of clinical trials include these participants, said Benjamin, at least in the U.S.
“It’s a case in which people who can’t find any other kind of work discover that the little bit of compensation you get through participating ends up being enough,” she said. Those people, again often working-class people and people of color, “are willing to bear the risks of the trials.” She says there’s no reason to think that the participants in the testing of synthetic blood will be any different.
Additionally, clinical trials for any foreign substance like synthetic blood also need to be performed on “pharmaceutically naïve” subjects: people whose bodies aren’t already full of drugs. Again, that often means exploiting the working class and minorities, as well as outsourcing to developing countries.
“It’s a regulatory question [researchers] have to ask: ‘What kind of place will allow us to come in and gather data and not put up as many barriers?’” Benjamin explained. “Often, these tend to be countries that have weaker governments, or that don’t have their own research community that would feel threatened by outside researchers coming in.” This reproduces the global North-South dynamic, a divide that allows richer, “northern” countries to regularly take advantage of “southern” second- and third-world nations.
Tru Blood to the Highest Bidder

And then there’s the potential danger of ownership. If the researchers at the SCRM choose to patent their technology (the way it would seem the sole manufacturers of the fictional Tru Blood did) they could stand to make a fortune off the stuff–and destroy a lot of potential future research in the process.
Consider the very recent Supreme Court case, Association for Molecular Pathology v. Myriad Genetics, Inc. Myriad had patents on two genes their researchers identified as being hereditarily linked to breast cancer: the genes themselves, not the method of finding them. (If you’ve heard of this case before, it’s probably thanks to the work Myriad did for Angelina Jolie.) Myriad’s patents not only made testing for the gene very expensive, but rendered second opinions impossible. But on June 13, SCOTUS ruled in the AMP’s favor, saying that Myriad could not patent actual DNA, which is found in nature; the company could, however, patent cDNA (or complementary DNA), synthetically engineered clones of said genes.
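For readers unfamiliar with the term, cDNA is synthesized in the lab by complementing a gene's messenger RNA, so it never occurs in nature in that exact form — which is the distinction the Court leaned on. A minimal sketch of that complementing step (illustrative only; the function name and the toy nine-base sequence are made up for this example):

```python
# cDNA sketch: each mRNA base is paired with its complementary DNA base.
# Because mRNA carries only a gene's expressed sequence, the resulting
# cDNA strand is a lab product rather than a naturally occurring molecule.

COMPLEMENT = {"A": "T", "U": "A", "G": "C", "C": "G"}

def cdna_from_mrna(mrna: str) -> str:
    """Return the complementary DNA strand for an mRNA sequence."""
    return "".join(COMPLEMENT[base] for base in mrna)

mrna = "AUGGCCUAA"           # toy mRNA: start codon, one codon, stop codon
print(cdna_from_mrna(mrna))  # TACCGGATT
```

Real cDNA synthesis is done enzymatically (reverse transcription), but the base-pairing logic is the same.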
The Myriad ruling applies almost solely to genetic research, however, which means that synthetic blood – which is similar to cDNA in that it’s created using biological templates – is still likely to be patented. “Regardless of this case, [the technology used to create] synthetic blood will likely be patented,” Benjamin said.
If the SCRM’s synthetic blood tests are successful and the technology does get patented and sent to an international market, it’s likely that – unlike Tru Blood’s price, which seems to have been kept low despite its makers’ monopoly on the market – price tags will be set high. (Want some precedent? Myriad Genetics’ patents allowed it to charge upwards of $3,000 for a test that would otherwise cost about $300.) That cost could seriously impact patients, especially in the U.S. and other nations without nationalized healthcare.
Ironically, if Benjamin’s research says anything, it’s that the people who can’t afford a medical breakthrough are often the people who secured its success in the first place.
“If a patient doesn’t have insurance, for example, you can imagine a doctor deciding, ‘Okay, do we use the synthetic blood that works faster and better [than real blood], and is more expensive? Or are we going to use the cheaper, real blood with this patient—if the patient doesn’t have insurance?’ It’s very likely doctors will choose the lower-quality product. The very same population who are the substrate for the research are, because of the class and racial dynamics, the same population denied access to it later because they’re uninsured.”
She says the Affordable Care Act may change this disparity, depending on what tests and treatments it covers.
Of course, none of this has yet come to pass in the case of the Scottish synthetic blood trials, since the trials themselves haven’t begun. Admittedly, there’s an outside chance that its potential success could still go the way of the polio vaccine; as its inventor Jonas Salk told a reporter in 1955 when asked who owned the patent for his discovery, “Well, the people, I would say. There is no patent. Could you patent the sun?”
But that magnanimity is unlikely in this day and age. As Benjamin points out, “The context in which synthetic blood is coming to market is a far cry from the days of Jonas Salk.”
After Supreme Court ruling, don't count out gene patenting quite yet
  • by Russell Brandom
  • June 14, 2013
The Supreme Court's Myriad Genetics case was argued back and forth all year before the justices finally reached a decision this Wednesday. In the end, they came to something of a compromise: human DNA cannot be patented, but synthetic DNA can. It's the same solution suggested by the president's solicitor general earlier this year, and it was seen by the court as the least painful way forward for everyone involved. Scientists can proceed with research, and companies will still have some protection for early discoveries. Still, the decision has left many observers puzzled. Is this a victory for copyright? A triumph of science? An early blow to genetic patent trolling before it starts? The answer seems to be all of the above.
Is this a victory for copyright? A triumph of science?
So far, both sides of the debate are declaring victory. The narrow issue is Myriad Genetics' claim on the BRCA genes, rare mutations that dramatically increase the risk of breast cancer. And while Myriad doesn't have a claim on the genes anymore, its proprietary test has survived unscathed. The "synthetic DNA" clause in the ruling covers cDNA, the synthesized DNA strands used to check for the BRCA genes, so a crucial stage in the process is still under patent. In a statement after the ruling, Myriad reminded investors that the company has "more than 500 valid and enforceable claims in 24 different patents conferring strong patent protection for its BRACAnalysis® test." None of those are for DNA (some of them don’t involve genetics at all), but it’s enough to make sure that no one can copy Myriad’s DNA test outright. That said, the ruling will also make it easier for researchers to develop competing tests, since they can study the gene without risk of infringing on any patents.
At the same time, the coalition of plaintiffs, which included the Association for Molecular Pathology with legal help from the American Civil Liberties Union, is already celebrating. When we spoke to Dr. Haig Kazazian, a Johns Hopkins geneticist and one of the plaintiffs in the case, he could hardly contain himself. "I'm overjoyed that this came through," Kazazian told The Verge. "Losing on the cDNA is, to me, not a big deal. I would oppose it, but I don't think it's that big a deal."
"I'm overjoyed this came through."
Much of the confusion surrounding the results comes from the specifics of Myriad’s test, which works by synthesizing cDNA to test for a specific gene. That’s a good way to test for genes, but it’s not the only way, and more accurate options like small-scale sequencing may already be on the way. This new batch of tests would totally avoid cDNA and, more importantly, Myriad’s patents. It’s good news for breast cancer advocates, who worried that Myriad’s monopoly on the BRCA gene was driving prices up and stifling research on potentially life-saving tests.
That particular test is a hot-button issue, especially after Angelina Jolie's high-profile mastectomy earlier this year, the result of a positive BRCA test result. Still, some doctors are still worried that Myriad's head start in the field might stunt research on the genes. Dr. Kazazian believes Myriad still has a head start thanks to their immense private database of testing information. It's the best tool doctors have for discovering new variants of the gene and stopping more breast cancer cases early — and despite the ruling, it's still proprietary to Myriad.
"Most human gene sequence patents will be expired in the next five years anyway."
Beyond BRCA, the effects of the Supreme Court's decision are more nebulous. Michele Wales, a lawyer specializing in biotech patents, expects the broader implications to be fairly limited. "Most human gene sequence patents will be expired in the next 5 years anyway, so these monopolies are going away," Wales told The Verge. "Diagnostic companies often rely on method claims, so this decision won’t affect them much either." (Method patents give a company claim to a particular testing procedure, but leave the door open for other tests to be developed.) A broader ruling might have changed the sector more deeply, but the court chose a way forward that leaves most biotech patents the way they are.
And in many ways, the BRCA gene is an exception in modern science: a powerful mutation that's easy to test for, an anomaly among an otherwise cluttered and mysterious genome. Most genes aren’t as well understood, and they’re considerably more difficult to patent thanks to a legal standard called "non-obviousness." To get any patent, a company needs to show its invention is a step beyond the current state of the art. But because of recent advances in whole-genome sequencing, most recent gene patents haven’t been able to clear that standard, and the result is that very few patents like Myriad’s are granted in modern courts. In other words, the business of stockpiling gene patents is just an expensive sideshow — one that's thankfully drawing to a close.
The biggest surprise out of the ruling is the slight precedent the court has set for human genetics going forward. By dividing natural genes from synthetic genetic products, the court has set the stage for a host of future technologies, from GMO organs to man-made gene transplants. None of those technologies exist yet, but this week saw the first glimpses of how the law might treat them: patenting synthetic genes but leaving naturally occurring sequences untouched. If the ruling wasn't as definitive as advocates might have hoped, it's because the justices know this won't be the last time they're asked to litigate the building blocks of life.
This is fascinating:
Oxygen mystery: How marine mammals hold their breath
Scientists say they have solved the mystery of one of the most extreme adaptations in the animal kingdom: how marine mammals store enough oxygen to hold their breath for up to an hour.
The team studied myoglobin, an oxygen-storing protein in mammals' muscles and found that, in whales and seals, it has special "non-stick" properties.
This allowed the animals to pack huge amounts of oxygen into their muscles without "clogging them up".
The findings are published in Science.
Dr Michael Berenbrink from the Institute of Integrative Biology at the University of Liverpool took part in the study.
He said that scientists had long wondered how marine mammals managed to pack so much of this vital protein into their bodies.
"At high enough concentrations, [proteins] tend to stick together, so we tried to understand how seals and whales evolved higher and higher concentrations of this protein in their muscles without a loss of function," he told BBC News.

The team extracted pure myoglobin from the muscles of mammals - from the land-based cow, to the semi-aquatic otter, all the way up to elite divers like the sperm whale.
Led by researcher Scott Mirceta, this painstaking examination traced the changes in myoglobin in deep-diving mammals through 200 million years of evolutionary history.
And it revealed that the best mammalian breath-holding divers had evolved a non-stick variety of myoglobin.
The secret, Dr Berenbrink explained, was a subtle but crucial piece of chemical trickery; marine mammal myoglobin is positively charged.

This has important physical consequences. Dr Berenbrink explained: "Like the similar poles of a magnet; the proteins repel one another."
"In this way we think the animals are able to pack really high concentrations of these proteins into their muscles and avoid them sticking together and clogging up the muscles."
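The like-charges-repel argument can be made concrete with a toy Coulomb's-law calculation (an illustrative sketch, not from the study; the net charges and separation below are invented round numbers, and real protein electrostatics also involve solvent screening):

```python
# Two molecules with like net charge have a positive (repulsive) Coulomb
# interaction energy, which discourages them from sticking together even
# at high concentration; opposite charges attract instead.

K_E = 8.988e9          # Coulomb constant, N*m^2/C^2
E_CHARGE = 1.602e-19   # elementary charge, C

def coulomb_energy(q1_e: float, q2_e: float, r_m: float) -> float:
    """Interaction energy in joules of two point charges given in units of e."""
    return K_E * (q1_e * E_CHARGE) * (q2_e * E_CHARGE) / r_m

# Two myoglobin-like molecules with net charge +2e, 5 nm apart:
e_like = coulomb_energy(+2, +2, 5e-9)   # positive -> repulsive
e_mixed = coulomb_energy(+2, -2, 5e-9)  # negative -> attractive

print(e_like > 0, e_mixed < 0)  # True True
```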
Dr Berenbrink said he was excited by the discovery because it helped make sense of the incredible changes that took place in mammals' bodies as they evolved from land-based animals to the aquatic, air-breathing creatures that inhabit the oceans today.
It showed, he said, the physiological change that accompanied the land to water transition of mammals.
"It also allows us to estimate the dive times of the ancient ancestors of whales," Dr Berenbrink explained.
"We can look at the fossils and predict the dive times they had."
Understanding exactly how mammals' bodies store oxygen so efficiently could also aid medical research.
Copying this bit of natural chemistry could aid the development of oxygen-carrying liquids that would deliver emergency supplies of oxygen to a person's tissues when a blood transfusion is not possible.
But its biggest impact will be in the realm of evolutionary biology.

Nicholas Pyenson, curator of fossil marine mammals at the Smithsonian Institution in Washington DC, said that the study was an exciting advancement for understanding the evolution of deep-diving.
"The idea that we can estimate maximal dive times for early diverging relatives of today's marine mammals will have a profound impact on how we think about their ancient ecology and biology," he told BBC News.
Professor Michael Fedak from the University of St Andrews' Sea Mammal Research Unit pointed out that myoglobin was only "part of the story" of how marine mammals were able to dive.
"But it's an important part," he said.
The scientist, who was not involved in this study, explained that a great deal of research at the moment was looking into how marine mammals manage to survive repeatedly cutting off and re-establishing the blood supply to their body tissues, something he likened to repeatedly suffering a crush injury.
"But being able to pick up a few [fossilised] bones of an extinct marine mammal and estimate its dive time from that - that's miraculous."
Ancient Roman Concrete Is About to Revolutionize Modern Architecture
By Bernhard Warner

June 14, 2013

After 2,000 years, the long-lost secret behind one of the most durable man-made materials in history—Roman concrete—has finally been discovered by an international team of scientists, and it may have a significant impact on how we build the cities of the future.
As anyone who’s ever visited Italy knows, the ancient Romans were master engineers. Their roads, aqueducts, and temples are still holding up remarkably well despite coming under siege over the centuries by waves of sacking marauders, mobs of tourists, and the occasional earthquake. One such structure that has fascinated geologists and engineers throughout the ages is the Roman harbor. Over the past decade, researchers from Italy and the U.S. have analyzed 11 harbors in the Mediterranean basin where, in many cases, 2,000-year-old (and sometimes older) breakwaters constructed out of Roman concrete stand perfectly intact despite constant pounding by the sea.
The most common blend of modern concrete, known as Portland cement, a formulation in use for nearly 200 years, can’t come close to matching that track record, says Marie Jackson, a research engineer at the University of California at Berkeley who was part of the Roman concrete research team. “The maritime environment, in particular, is not good for Portland concrete. In seawater, it has a service life of less than 50 years. After that, it begins to erode,” Jackson says.
The researchers now know why ancient Roman concrete is so superior. They extracted from the floor of Italy’s Pozzuoli Bay, at the northern tip of the Bay of Naples, a sample of concrete breakwater that dates back to 37 B.C. and analyzed its mineral components at research labs in Europe and the U.S., including Berkeley Lab’s Advanced Light Source. The analysis, the scientists believe, reveals the lost recipe of Roman concrete, and it also points to how much more stable and less environmentally damaging it is than today’s blend.
That’s why the findings, which were published earlier this month in the Journal of the American Ceramic Society and American Mineralogist, are considered so important for today’s industrial engineers and the future of the world’s cities and ports. “The building industry has been searching for a way to make more durable concretes,” Jackson points out.
Another remarkable quality of Roman concrete is that its production was exceptionally green, a far cry from modern techniques. “It’s not that modern concrete isn’t good—it’s so good we use 19 billion tons of it a year,” says Paulo Monteiro, a research collaborator and professor of civil and environmental engineering at the University of California, Berkeley. “The problem is that manufacturing Portland cement accounts for 7 percent of the carbon dioxide that industry puts into the air.”
The secret to Roman concrete lies in its unique mineral formulation and production technique. As the researchers explain in a press release outlining their findings, “The Romans made concrete by mixing lime and volcanic rock. For underwater structures, lime and volcanic ash were mixed to form mortar, and this mortar and volcanic tuff were packed into wooden forms. The seawater instantly triggered a hot chemical reaction. The lime was hydrated—incorporating water molecules into its structure—and reacted with the ash to cement the whole mixture together.”
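The reaction sequence in that description can be sketched in textbook pozzolanic chemistry (a standard-chemistry reading of the press release, not equations from the papers themselves; the reported Roman binder is a calcium-aluminum-silicate-hydrate):

```latex
% Lime hydration (exothermic; triggered here by seawater):
\mathrm{CaO} + \mathrm{H_2O} \;\longrightarrow\; \mathrm{Ca(OH)_2}

% Pozzolanic reaction with the silica and alumina in volcanic ash,
% yielding the calcium-(aluminum-)silicate-hydrate binder:
\mathrm{Ca(OH)_2} + \underbrace{\mathrm{SiO_2},\ \mathrm{Al_2O_3}}_{\text{volcanic ash}} + \mathrm{H_2O} \;\longrightarrow\; \text{C-(A-)S-H gel}
```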
The Portland cement formula crucially lacks the lime and volcanic ash mixture. As a result, it doesn’t bind as well as the Roman concrete, the researchers found. It is this inferior binding property that explains why structures made of Portland cement tend to weaken and crack after a few decades of use, Jackson says.
Adopting the materials (more volcanic ash) and production techniques of ancient Rome could revolutionize today’s building industry with a sturdier, less CO2-intensive concrete. “The question remains, can we translate the principles from ancient Rome to the production of modern concrete? I think that is what is so exciting about this new area of research,” Jackson says.
Of course, if you are no fan of concrete architecture, you’re out of luck. It could be with us for a few millennia more.
Although a number of studies have recognized the still-experimental fecal transplant procedure as a reliable cure for the debilitating infection caused by the bacterium Clostridium difficile, or C. diff, FDA regulations made it difficult for doctors to perform the procedure. Now, however, the agency says that it will "exercise enforcement discretion" and no longer require doctors to receive agency approval before performing it. Earlier this year, the FDA introduced a policy that classified fecal transplants as an "investigational new drug," which meant that doctors needed to submit an extensive application to the agency and wait up to 30 days for a response; as a result, many doctors chose not to perform the procedure at all.
Fortunately for the 3 million people affected by C. diff in the US every year, the FDA is changing course. Thanks to the doctors who protested the new regulation, the FDA now says that it won't require doctors to receive its approval before performing a fecal transplant. However, the agency also notes that it is only lifting that requirement when the transplant is performed on C. diff patients who have not responded to other treatments; the patient must also discuss the "potential risks" with the doctor and acknowledge that the treatment is "investigational." But while the FDA hasn't fully embraced fecal transplants, it's at least making things easier for those who might benefit from the treatment.
NOAA, partners predict possible record-setting dead zone for Gulf of Mexico
Smaller than average hypoxia level also anticipated in Chesapeake Bay

June 18, 2013

An area with low dissolved oxygen is often referred to as a “dead zone” (in red above) because most marine life either dies or, if mobile like fish, leaves the area. Habitats that would normally be teeming with life become, essentially, biological deserts.
(Image credit: NOAA)
Scientists are expecting a very large “dead zone” in the Gulf of Mexico and a smaller than average hypoxic level in the Chesapeake Bay this year, based on several NOAA-supported forecast models.
NOAA-supported modelers at the University of Michigan, Louisiana State University, and the Louisiana Universities Marine Consortium are forecasting that this year’s Gulf of Mexico hypoxic “dead” zone will be between 7,286 and 8,561 square miles, which could place it among the ten largest recorded. That would range from an area the size of Connecticut, Rhode Island, and the District of Columbia combined on the low end to the size of New Jersey on the upper end. The high estimate would exceed the largest zone ever reported, 8,481 square miles in 2002.
Hypoxic (very low oxygen) and anoxic (no oxygen) zones are caused by excessive nutrient pollution, often from human activities such as agriculture, which results in insufficient oxygen to support most marine life in near-bottom waters. Aspects of weather, including wind speed, wind direction, precipitation and temperature, also impact the size of dead zones.
The Gulf estimate is based on the assumption of no significant tropical storms in the two weeks preceding or during the official measurement survey cruise, scheduled from July 25 to August 3, 2013. If a storm does occur, the size estimate could drop to a low of 5,344 square miles, slightly smaller than the size of Connecticut.
This year’s prediction for the Gulf reflects flood conditions in the Midwest that caused large amounts of nutrients to be transported from the Mississippi watershed to the Gulf. Last year’s dead zone in the Gulf of Mexico was the fourth smallest on record due to drought conditions, covering an area of approximately 2,889 square miles, slightly larger than the state of Delaware. The overall average from 1995 to 2012 is 5,960 square miles, an area about the size of Connecticut.
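The size comparisons in the release are easy to verify. A minimal sketch in Python, using only the figures quoted above:

```python
# Gulf of Mexico dead-zone figures quoted in the NOAA release (square miles).
forecast_low, forecast_high = 7_286, 8_561   # this year's forecast range
record_2002 = 8_481                          # largest zone ever reported
average_1995_2012 = 5_960                    # long-term average
zone_2012 = 2_889                            # last year's zone (drought year)

# The high end of the forecast would indeed exceed the 2002 record.
assert forecast_high > record_2002

# Even the low end of the forecast sits well above the long-term average,
# and the high end is roughly three times last year's drought-shrunk zone.
print(f"Low estimate vs. 1995-2012 average: {forecast_low / average_1995_2012:.2f}x")
print(f"High estimate vs. 2012 zone: {forecast_high / zone_2012:.2f}x")
```

The ratios make the swing concrete: even the conservative forecast is about 20 percent above the long-term average.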
A second NOAA-funded forecast, for the Chesapeake Bay, calls for a smaller than average dead zone in the nation's largest estuary. The forecast, from researchers at the University of Maryland Center for Environmental Science and the University of Michigan, has three parts: a prediction for the mid-summer volume of the low-oxygen hypoxic zone, one for the mid-summer oxygen-free anoxic zone, and a third that is an average value for the entire summer season.
The forecasts call for a mid-summer hypoxic zone of 1.46 cubic miles, a mid-summer anoxic zone of 0.26 to 0.38 cubic miles, and a summer average hypoxia of 1.108 cubic miles, all at the low end of previously recorded zones. Last year the final mid-summer hypoxic zone was 1.45 cubic miles.
This is the seventh year for the Bay outlook which, because of the shallow nature of large areas of the estuary, focuses on water volume or cubic miles, instead of square mileage as used in the Gulf. The history of hypoxia in the Chesapeake Bay since 1985 can be found at the EcoCheck website.
Both forecasts are based on nutrient run-off and river stream data from the U.S. Geological Survey (USGS), with the Chesapeake data funded with a cooperative agreement between USGS and the Maryland Department of Natural Resources. Those numbers are then inserted into models developed by funding from the National Ocean Service’s National Centers for Coastal Ocean Science (NCCOS).
"Monitoring the health and vitality of our nation’s oceans, waterways, and watersheds is critical as we work to preserve and protect coastal ecosystems,” said Kathryn D. Sullivan, Ph.D., acting under secretary of commerce for oceans and atmosphere and acting NOAA administrator. “These ecological forecasts are good examples of the critical environmental intelligence products and tools that help shape a healthier coast, one that is so inextricably linked to the vitality of our communities and our livelihoods.”
The dead zone in the Gulf of Mexico affects nationally important commercial and recreational fisheries, and threatens the region’s economy. The Chesapeake dead zones, which have been highly variable in recent years, threaten a multi-year effort to restore the Bay’s water quality and enhance its production of crabs, oysters, and other important fisheries.
During May 2013, stream flows in the Mississippi and Atchafalaya rivers were above normal resulting in more nutrients flowing into the Gulf. According to USGS estimates, 153,000 metric tons of nutrients flowed down the rivers to the northern Gulf of Mexico in May, an increase of 94,900 metric tons over last year’s 58,100 metric tons, when the region was suffering through drought. The 2013 input is an increase of 16 percent above the average nutrient load estimated over the past 34 years.
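The nutrient-load arithmetic in that paragraph checks out; here is a quick sketch (note that the implied 34-year average is derived from the stated 16 percent figure, not quoted directly in the release):

```python
# USGS May nutrient-load figures quoted above (metric tons).
may_2013 = 153_000
may_2012 = 58_100          # drought year

increase = may_2013 - may_2012
assert increase == 94_900  # matches the stated year-over-year increase

# "16 percent above the average ... over the past 34 years" implies a
# long-term May average of roughly may_2013 / 1.16.
implied_average = may_2013 / 1.16
print(f"Implied 34-year average: about {implied_average:,.0f} metric tons")
```

In other words, 2013's load is not just high relative to a drought year; it is well above the multi-decade norm.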

For the Chesapeake Bay, USGS estimates 36,600 metric tons of nutrients entered the estuary from the Susquehanna and Potomac rivers between January and May, which is 30 percent below the average loads estimated from 1990 to 2013.
“Long-term nutrient monitoring and modeling is key to tracking how nutrient conditions are changing in response to floods and droughts and nutrient management actions,” said Lori Caramanian, deputy assistant secretary of the interior for water and science. “Understanding the sources and transport of nutrients is key to developing effective nutrient management strategies needed to reduce the size of hypoxia zones in the Gulf, Bay and other U.S. waters where hypoxia is an on-going problem.”
“Coastal hypoxia is proliferating around the world,” said Donald Boesch, Ph.D., president of the University of Maryland Center for Environmental Science. “It is important that we have excellent abilities to predict and control the largest dead zones in the United States. The whole world is watching.”
The confirmed size of the 2013 Gulf hypoxic zone will be released in August, following a monitoring survey led by the Louisiana Universities Marine Consortium beginning in late July, and the result will be used to improve future forecasts. The final measurement in the Chesapeake will come in October following surveys by the Chesapeake Bay Program’s partners from the Maryland Department of Natural Resources and the Virginia Department of Environmental Quality.
Despite the Mississippi River/Gulf of Mexico Nutrient Task Force’s goal to reduce the dead zone to less than 2,000 square miles, it has averaged 5,600 square miles over the last five years. Demonstrating the link between the dead zone and nutrients from the Mississippi River, this annual forecast continues to provide guidance to federal and state agencies as they work on the 11 implementation actions outlined by the Task Force in 2008 for mitigating nutrient pollution.
NOAA’s National Ocean Service has been funding investigations and forecast development for the dead zone in the Gulf of Mexico since 1990, and oversees national hypoxia research programs which include the Chesapeake Bay and other affected bodies of water.
USGS operates more than 3,000 real-time stream gages and collects water quality data at numerous long-term stations throughout the Mississippi River basin and the Chesapeake Bay to track how nutrient loads are changing over time.
The National Centers for Coastal Ocean Science is the coastal science office for NOAA’s National Ocean Service. Visit our website or follow our blog to read more about NCCOS research.
USGS provides science for a changing world. Follow us on Twitter @USGS and our other social media channels.
NOAA’s mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources. Join us on Facebook, Twitter, and our other social media channels.
Teaching Complete Evolutionary Stories Increases Learning
June 15, 2013 — Many students have difficulty understanding and explaining how evolution operates. In search of better ways to teach the subject, researchers at Michigan State University developed complete evolutionary case studies spanning the gamut from the molecular changes underlying an evolving characteristic to their genetic consequences and effects in populations. The researchers, Peter J. T. White, Merle K. Heidemann, and James J. Smith, then incorporated two of the scenarios into a cellular and molecular biology course taught to undergraduates at the university's Lyman Briggs College.

When the students' understanding was tested, the results showed that students who had understood an integrated evolutionary scenario were better at explaining and describing how evolution works in general.
The results of the research, described in the July issue of BioScience, are significant because evolution is not usually taught in this comprehensive, soup-to-nuts way. Rather, instructors teach examples of parts of the evolutionary process, such as the ecological effects of natural selection or the rules of genetic inheritance, separately. It appears that this fragmentation makes it harder for students to understand the process as a whole.
White and his colleagues note that "surprisingly few" comprehensive evolutionary study systems have been described, although the number is growing. The two employed in the BioScience study were about the evolution of sweet taste and wrinkled skin in domestic garden peas, and the evolution of light or dark coat color in beach mice living on light or dark sand. Students were tested on the beach mouse coat color scenario as well as on evolutionary principles in general. Understanding the beach mouse example was a better predictor of good responses to questions about evolution in general than was performance on the course as a whole. This suggests that improvements in evolutionary understanding came mostly from studying the integrated evolution scenarios.
