Harvard cracks DNA storage, crams 700 terabytes of data into a single gram

From extremetech.com

A bioengineer and a geneticist at Harvard’s Wyss Institute have successfully stored 5.5 petabits of data (around 700 terabytes) in a single gram of DNA, smashing the previous DNA data density record by a factor of one thousand.

The work, carried out by George Church and Sri Kosuri, basically treats DNA as just another digital storage device. Instead of binary data being encoded as magnetic regions on a hard drive platter, strands of DNA that store 96 bits are synthesized, with each of the bases (TGAC) representing a binary value (T and G = 1, A and C = 0).

To read the data stored in DNA, you simply sequence it — just as if you were sequencing the human genome — and convert each of the TGAC bases back into binary. To aid with sequencing, each strand of DNA has a 19-bit address block at the start (the red bits in the image below) — so a whole vat of DNA can be sequenced out of order, and then sorted into usable data using the addresses.
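To make the scheme concrete, here is a minimal, hypothetical sketch in Python of how such an encode/decode cycle could work, assuming the one-bit-per-base mapping and the 19-bit address plus 96-bit payload layout described above. For simplicity it always writes 0 as A and 1 as T (the actual work can choose between A/C and T/G to avoid troublesome repeated bases, and involves far more engineering), so treat it as an illustration of the idea rather than the authors’ pipeline.

```python
# Toy illustration of the encoding described above (not the authors' actual
# pipeline): each bit maps to one base (0 -> A, 1 -> T here; the real scheme
# can use A or C for 0 and T or G for 1), and every strand carries a 19-bit
# address followed by a 96-bit payload.

ADDRESS_BITS = 19
PAYLOAD_BITS = 96

def bits_to_strand(address, payload_bits):
    """Build one synthetic 'strand' from an integer address and a list of bits."""
    addr_bits = [(address >> i) & 1 for i in range(ADDRESS_BITS - 1, -1, -1)]
    return "".join("T" if b else "A" for b in addr_bits + payload_bits)

def strand_to_bits(strand):
    """'Sequence' a strand back into (address, payload bits). T/G -> 1, A/C -> 0."""
    bits = [1 if base in "TG" else 0 for base in strand]
    address = int("".join(map(str, bits[:ADDRESS_BITS])), 2)
    return address, bits[ADDRESS_BITS:]

def encode(data: bytes):
    """Split a byte string into addressed 96-bit strands."""
    bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]
    strands = []
    for addr, start in enumerate(range(0, len(bits), PAYLOAD_BITS)):
        chunk = bits[start:start + PAYLOAD_BITS]
        chunk += [0] * (PAYLOAD_BITS - len(chunk))      # pad the last strand
        strands.append(bits_to_strand(addr, chunk))
    return strands

def decode(strands):
    """Reassemble bytes from strands, regardless of the order they were read in."""
    ordered = sorted(strand_to_bits(s) for s in strands)
    bits = [b for _, payload in ordered for b in payload]
    data = bytearray()
    for i in range(0, len(bits), 8):
        data.append(int("".join(map(str, bits[i:i + 8])), 2))
    return bytes(data).rstrip(b"\x00")

strands = encode(b"Hello, DNA")
assert decode(reversed(strands)) == b"Hello, DNA"   # read-back order doesn't matter
```

Because every strand carries its own address, the decoder can reassemble the file even when strands come back in arbitrary order, which is exactly why the address block matters when you sequence a whole vat at once.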

Scientists have been eyeing up DNA as a potential storage medium for a long time, for three very good reasons: It’s incredibly dense (you can store one bit per base, and a base is only a few atoms large); it’s volumetric (beaker) rather than planar (hard disk); and it’s incredibly stable — where other bleeding-edge storage mediums need to be kept in sub-zero vacuums, DNA can survive for hundreds of thousands of years in a box in your garage.

It is only with recent advances in microfluidics and labs-on-a-chip that synthesizing and sequencing DNA has become an everyday task, though. While it took years for the original Human Genome Project to analyze a single human genome (some 3 billion DNA base pairs), modern lab equipment with microfluidic chips can do it in hours. Now this isn’t to say that Church and Kosuri’s DNA storage is fast — but it’s fast enough for very-long-term archival.

Just think about it for a moment: One gram of DNA can store 700 terabytes of data. That’s 14,000 50-gigabyte Blu-ray discs… in a droplet of DNA that would fit on the tip of your pinky. To store the same kind of data on hard drives — the densest storage medium in use today — you’d need 233 3TB drives, weighing a total of 151 kilos. In Church and Kosuri’s case, they have successfully stored around 700 kilobytes of data in DNA — Church’s latest book, in fact — and proceeded to make 70 billion copies (which they claim, jokingly, makes it the best-selling book of all time!) totaling 44 petabytes of data stored.
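The headline figures are easy to sanity-check with back-of-the-envelope arithmetic; decimal versus binary units also explain why 70 billion copies of a roughly 700 KB book comes out near the quoted 44 petabytes.

```python
# Back-of-the-envelope check of the figures quoted above (decimal units).
TB = 1000 ** 4
data = 700 * TB                               # ~700 TB in one gram of DNA

print(data / (50 * 1000 ** 3))                # 14,000 fifty-gigabyte Blu-ray discs
print(data / (3 * TB))                        # ~233 three-terabyte hard drives

copies = 70e9 * 700 * 1000                    # 70 billion copies of a ~700 KB book
print(copies / 1000 ** 5, copies / 2 ** 50)   # ~49 decimal PB, ~44 binary PB
```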

Looking forward, they foresee a world where biological storage would allow us to record anything and everything without reservation. Today, we wouldn’t dream of blanketing every square meter of Earth with cameras, and recording every moment for all eternity/human posterity — we simply don’t have the storage capacity. There is a reason that backed up data is usually only kept for a few weeks or months — it just isn’t feasible to have warehouses full of hard drives, which could fail at any time. If the entirety of human knowledge — every book, uttered word, and funny cat video — can be stored in a few hundred kilos of DNA, though… well, it might just be possible to record everything (hello, police state!)

It’s also worth noting that it’s possible to store data in the DNA of living cells — though only for a short time. Storing data in your skin would be a fantastic way of transferring data securely…


Human immortality could be possible by 2045, say Russian scientists

by Lauren O’Neil from cbc.ca

If Dmitry Itskov’s 2045 initiative plays out as planned, humans will have the option of living forever with the help of machines in only 33 years.

It may sound ridiculous, but the 31-year-old Russian mogul is dead serious about neuroscience, android robotics, and cybernetic immortality.

He has already pulled together a team of leading Russian scientists intent on creating fully functional holographic human avatars that house artificial brains which contain a person’s complete consciousness – in other words, a humanoid robot.

Together, they’ve laid out an ambitious course of action that would see the team transplant a human brain into an artificial body (or ‘avatar’) in as little as seven years time.

Now, Itskov is asking the world’s richest people for help in financing the project.

In exchange, he’s offered to coordinate their own personal immortality projects for free.

“I urge you to take note of the vital importance of funding scientific development in the field of cybernetic immortality and the artificial body,” he writes in an open letter to members of the Forbes World’s Billionaires List.

“Such research has the potential to free you, as well as the majority of all people on our planet, from disease, old age and even death.”

Itskov goes on to offer skeptics a meeting with “a team of the world’s leading scientists working in this field ” to prove the viability of the concept of cybernetic immortality.

And while many are skeptical that such a plan could ever come to fruition, Popular Science Magazine points out that phase one, creating a robot controlled by a human brain, is already well within reach.

“DARPA is already working on it via a program called “Avatar” (which, incidentally, is also the name of Itskov’s project) through which the Pentagon hopes to create a brain-machine interface that will allow soldiers to control bipedal human surrogate machines remotely with their minds,” writes PopSci’s Clay Dillow.

“And of course there are all the ongoing medical prosthesis projects that have shown that the human nervous system can interface with prosthetic enhancements, manipulating them via thought. Itskov draws a clear arc from what we have now to the consciousness-containing holograms that he envisions. All we have to do is attack the technological obstacles in between, one at a time, until we get there.”

Discovery’s Alyssa Danigelis takes an opposing stance to the very idea.

“There’s a world of difference between pursuing a brain-controlled exoskeleton to help paraplegics regain control and wanting to essentially upload a human brain into an artificial body,” she writes.

“I read a sci-fi novel involving disembodied live brains once. It didn’t turn out well.”

ORIGINAL ARTICLE HERE


Arthur C. Clarke Predicts the Internet and the PC

In 1974, Arthur C. Clarke told the ABC that by 2001 every household would have a computer and be connected to others all over the world. Courtesy of the Australian Broadcasting Corporation.


Well… this guy is awesome

Jason Silva. Futurist… and self-proclaimed Ecstatic Truth Lover and Techno Optimist.

More at http://vimeo.com/jasonsilva


How does SOPA affect Canada and what can we do about it?

For Canadians wanting to learn about the US bills SOPA and PIPA, you can read this blog:

http://www.michaelgeist.ca/content/view/6244/125/

Take a listen to the CBC radio show linked there, and you will understand much more about the effects we would experience here in Canada if this US legislation passes. You will also find links to the Canadian counterparts of the bills and ways you can help oppose them.

Do you want access to the whole internet? Or only part of it?


Scientists Reconstruct Brains’ Visions Into Digital Video In Historic Experiment

UC Berkeley scientists have developed a system to capture visual activity in human brains and reconstruct it as digital video clips. Eventually, this process will allow you to record and reconstruct your own dreams on a computer screen.

I just can’t believe this is happening for real, but according to Professor Jack Gallant—UC Berkeley neuroscientist and coauthor of the research published today in the journal Current Biology—”this is a major leap toward reconstructing internal imagery. We are opening a window into the movies in our minds.”

Indeed, it’s mindblowing. I’m simultaneously excited and terrified. This is how it works:
They used three different subjects for the experiments; incidentally, the subjects were part of the research team, because the procedure requires being inside a functional magnetic resonance imaging (fMRI) system for hours at a time. The subjects were shown two different groups of Hollywood movie trailers while the fMRI system recorded blood flow through their visual cortex.

The readings were fed into a computer program in which they were divided into three-dimensional pixel units called voxels (volumetric pixels). This process effectively decodes the brain signals generated by moving pictures, connecting the shape and motion information from the movies to specific brain actions. As the sessions progressed, the computer learned more and more about how the visual activity presented on the screen corresponded to the brain activity.
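The “learning” step being described here amounts to fitting an encoding model: for each voxel, estimate how the shape and motion content on screen drives that voxel’s response. Below is a small hypothetical sketch in Python of that idea, using plain ridge regression on made-up feature and voxel matrices; the study’s actual motion-energy model is considerably more sophisticated, so this is only a stand-in.

```python
# Hypothetical sketch of the training step described above: learn, for each
# voxel, how on-screen motion/shape features map to fMRI response. Plain ridge
# regression stands in for the study's motion-energy encoding model.
import numpy as np

def fit_encoding_model(stimulus_features, voxel_responses, ridge=1.0):
    """stimulus_features: (n_timepoints, n_features) describing the movie frames.
    voxel_responses: (n_timepoints, n_voxels) recorded fMRI activity.
    Returns weights (n_features, n_voxels) predicting each voxel from the stimulus."""
    X, Y = stimulus_features, voxel_responses
    # Ridge regression in closed form: W = (X^T X + lambda*I)^-1 X^T Y
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

def predict_responses(stimulus_features, weights):
    """Predict the voxel activity a new clip would evoke."""
    return stimulus_features @ weights

# Fake data: 2,000 timepoints of training movie, 500 stimulus features, 1,000 voxels.
rng = np.random.default_rng(1)
X_train = rng.standard_normal((2000, 500))
true_W = rng.standard_normal((500, 1000))
Y_train = X_train @ true_W + 0.5 * rng.standard_normal((2000, 1000))

W = fit_encoding_model(X_train, Y_train)
Y_hat = predict_responses(rng.standard_normal((10, 500)), W)   # (10, 1000) predictions
```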

An 18-million-second picture palette

After recording this information, another group of clips was used to reconstruct the videos shown to the subjects. The computer analyzed 18 million seconds of random YouTube video, building a database of potential brain activity for each clip. From all these videos, the software picked the one hundred clips whose predicted brain activity was most similar to the activity recorded while the subject watched, combining them into one final movie. Although the resulting video is low resolution and blurry, it clearly matched the actual clips watched by the subjects.

Think about those 18 million seconds of random videos as a painter’s color palette. A painter sees a red rose in real life and tries to reproduce the color using the different kinds of reds available in his palette, combining them to match what he’s seeing. The software is the painter and the 18 million seconds of random video is its color palette. It analyzes how the brain reacts to certain stimuli, compares it to the brain reactions to the 18-million-second palette, and picks what more closely matches those brain reactions. Then it combines the clips into a new one that matches what the subject was seeing. Notice that the 18 million seconds of motion video are not what the subject is seeing. They are random bits used just to compose the brain image.
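As a rough illustration of that palette matching, here is a hypothetical sketch (again in Python, with invented names and shapes): given the observed fMRI response to an unseen clip and the predicted responses for every clip in the library, rank the library by similarity, take the hundred closest matches and blend their frames. The real reconstruction is far more elaborate, but the mix-the-closest-clips idea is the same.

```python
# A minimal sketch of the "palette" idea described above, not the Gallant lab's
# actual method: rank library clips by how well their predicted voxel activity
# matches the observed fMRI response, then average the best matches.
import numpy as np

def reconstruct(observed_activity, library_activity, library_frames, top_k=100):
    """observed_activity: (n_voxels,) fMRI response to the unseen clip.
    library_activity: (n_clips, n_voxels) predicted responses for library clips.
    library_frames: (n_clips, height, width) a representative frame per clip.
    Returns a blurry 'reconstruction' as the average of the top_k best matches."""
    # Correlation-like similarity between the observed response and each clip's prediction.
    obs = observed_activity - observed_activity.mean()
    lib = library_activity - library_activity.mean(axis=1, keepdims=True)
    scores = (lib @ obs) / (np.linalg.norm(lib, axis=1) * np.linalg.norm(obs) + 1e-9)

    best = np.argsort(scores)[-top_k:]          # indices of the closest clips
    return library_frames[best].mean(axis=0)    # blend them, like mixing paint

# Tiny fake example: 5,000 library clips, 1,000 voxels, 32x32 grayscale frames.
rng = np.random.default_rng(0)
library_activity = rng.standard_normal((5000, 1000))
library_frames = rng.random((5000, 32, 32))
observed = library_activity[42] + 0.1 * rng.standard_normal(1000)  # noisy view of clip 42

image = reconstruct(observed, library_activity, library_frames)
print(image.shape)  # (32, 32)
```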

Given a big enough database of video material and enough computing power, the system would be able to match any images in your brain.

In this other video you can see how this process worked for the three experimental subjects. In the top left square you can see the movie the subjects were watching while they were in the fMRI machine. Right below it you can see the movie “extracted” from their brain activity. It shows that this technique gives consistent results independent of what’s being watched, or who’s watching. The three lines of clips next to the left column show the random movies that the computer program used to reconstruct the visual information.

Right now, the resulting quality is not good, but the potential is enormous. Lead research author—and one of the lab test bunnies—Shinji Nishimoto thinks this is the first step to tap directly into what our brain sees and imagines:

Our natural visual experience is like watching a movie. In order for this technology to have wide applicability, we must understand how the brain processes these dynamic visual experiences.

The brain recorders of the future

Imagine that. Capturing your visual memories, your dreams, the wild ramblings of your imagination into a video that you and others can watch with your own eyes.

This is the first time in history that we have been able to decode brain activity and reconstruct motion pictures on a computer screen. The path that this research opens boggles the mind. It reminds me of Brainstorm, the cult movie in which a group of scientists led by Christopher Walken develops a machine capable of recording the five senses of a human being and then playing them back into the brain itself.

This new development brings us closer to that goal, which, I have no doubt, will happen at some point. Given the exponential increase in computing power and our understanding of human biology, I think it will arrive sooner than most mortals expect. Perhaps one day you will be able to go to sleep with a flexible band around your skull labeled Sony Dreamcam, wirelessly connected to your iPad 7.

Original article from gizmodo can be found here.

UC Berkeley article


Floating city, libertarian colonies close to reality

Article by Marianne English

Cities come in all shapes and sizes, but can they float?

PayPal founder Peter Thiel recently invested $1.25 million to try to make that happen. By offering support to the Seasteading Institute, he has jump-started efforts to create city-state communities afloat on ocean platforms. In 2009, the institute held a 3D design competition to help people visualize what these sea-bound communities may look like (see photos above and below).

The structures will weigh 12,000 tons, run on diesel power and carry around 270 people per unit, as reported by Details Magazine. The idea is to link the structures together to create ocean metropolises equipped to accommodate millions of people.

But here’s the kicker: Thiel hopes to place the structures beyond the coastal jurisdiction of existing countries and their laws. Call it a libertarian pipe dream if you will, but he hopes to experiment with different styles of government. In some cases, he wishes to start up city-states without minimum wages and welfare, with fewer weapons restrictions and “looser” building codes, he says. The Seasteading Institute already has a team working on the legal issues involved.

Floating pilot projects, which will include office buildings, will be placed off the coast of San Francisco as early as next year.

Read full article here.