Using the nucleobases (G, A, T, C), super sciency scientists have done some sciency science and found a way to encode binary! Good news: 700 terabytes per gram of DNA! Long-lasting! 99.99-100% data integrity! Bad news: $12,400 per megabyte to encode data! $220 per megabyte to decode data! Ten years before feasible commercial usage (just like nuclear fusion)! http://www.extremetech.com/extreme/...rams-700-terabytes-of-data-into-a-single-gram http://en.wikipedia.org/wiki/DNA_digital_data_storage Discuss!
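For scale, here's a quick back-of-the-envelope sketch using the figures quoted above (700 TB/gram, $12,400/MB to write, $220/MB to read). Decimal units and linear cost scaling are my assumptions, not anything from the articles:

```python
# Back-of-the-envelope from the figures quoted in the post above.
# Assumptions: 1 GB = 1000 MB, 1 TB = 1000 GB, costs scale linearly.

DENSITY_TB_PER_GRAM = 700
WRITE_COST_PER_MB = 12_400   # dollars
READ_COST_PER_MB = 220       # dollars

gb = 1                       # archive a single gigabyte
mb = gb * 1000

print(f"Encoding {gb} GB costs ${mb * WRITE_COST_PER_MB:,}")  # $12,400,000
print(f"Decoding it back costs ${mb * READ_COST_PER_MB:,}")   # $220,000
print(f"Mass of the DNA: {gb / 1000 / DENSITY_TB_PER_GRAM * 1e6:.2f} micrograms")
```

So a single gigabyte would cost over twelve million dollars to write but would weigh about a microgram, which is the whole trade-off in one line.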
I've known about them wanting to do this for a while, but I've never seen the cost numbers before. My biggest question is how soon it will be possible to read the data back quickly. Storing lots of data certainly has its uses, but if you can't access it, I don't see it being used for anything but massive databases that don't get constant use. Like tax records from 40 years ago.
We already have fast-access stuff (solid state), but it would be nice to have massive data storage - like a hundred thousand terabytes in the size of a modern-day hard drive.
It doesn't need to be faster than our fastest, it just needs to be fast enough to use. I don't want to wait 4 hours to decode a movie I stored so I can run it on my comp, let alone access massive databases to pull values. Which I'm assuming is the intended purpose.
No, it probably isn't intended first just for your entertainment. First thing to mind is how many pirated movies you could download to it lol... No, the first use is like he said: storing patient records of people who died 40 years ago or something. You could pull anything back up forever once this comes into use, as long as you have a day's heads up and it isn't too large in itself. But storing thousands of terabytes, isn't that fascinating in itself? You could store a copy of every piece of literature ever created in a single room.
I meant the intended use was massive databases, not the movie bit. A movie I legitimately download as part of my Blu-ray package is about 20 gigs. That is way, way smaller than the tax records of every person in the United States for a decade, and would be used significantly less. It would take less time, cost less, and be generally more useful to keep those files in a normal storage system until DNA storage improves to a usable speed. If it took a day for every pull on a database that gets pulled 20 times a day, that backs up super duper fast.
http://physicsworld.com/cws/article...ystal-heralds-unlimited-lifetime-data-storage I'm still of the opinion that this has the greater long-term potential - one, they can already do it. It's just a matter of continuing to bring the price down and the speed up. 8 Mbit/s is still really slow on the write speed when you're talking hundreds of terabytes. On the plus side, read speed is MUCH faster, so for data that doesn't change, this is still a great option. I could see a situation where the two could be combined for the best of medium-term and long-term storage.
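To put that 8 Mbit/s write speed in perspective, a rough sketch (assuming sustained throughput with no overhead, and decimal units, which are my assumptions):

```python
# How long would it take to write 100 TB at a sustained 8 Mbit/s?
# Rough sketch; assumes decimal units and no protocol overhead.

write_speed_bits_per_s = 8e6     # 8 Mbit/s
data_bits = 100e12 * 8           # 100 TB expressed in bits

seconds = data_bits / write_speed_bits_per_s
years = seconds / (365 * 24 * 3600)
print(f"{years:.1f} years")      # ~3.2 years for a single 100 TB write
```

Years of wall-clock time for one archive-scale write is exactly why this only makes sense for data that never changes.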
For someone using a computer as an art studio or photography assistant, portability is key. The last thing a working photographer wants is a bulky hard drive in the field, and when transporting digital art from one location to another, you can't always rely on your network.
Hmm, maybe not. At an estimated 1.2 zettabytes, that would be roughly 3532 pounds. Most cars might struggle a bit with that. However, a good sized truck should be ok. ;-)
According to this, it's four zettabytes. That's 4,294,967,296 terabytes. Divide by 700 and you get ~6,135,668 grams, which is ~6 metric tons (whoops, I remembered it wrong). Maybe you can't drive it around in your car, but your average Chinook would have no issues.
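Checking that math, using binary prefixes as in the post above (1 ZB = 2^30 TB here, which is the convention that post appears to use):

```python
# Sanity check of the figures in the post above:
# 4 zettabytes at 700 TB per gram of DNA.

ZETTABYTES = 4
tb = ZETTABYTES * 2**30          # 4 ZB in TB (binary prefixes)
grams = tb / 700                 # at 700 TB per gram
tonnes = grams / 1e6             # 1 metric ton = 1,000,000 g

print(f"{tb:,} TB -> {grams:,.0f} g -> {tonnes:.1f} metric tons")
# 4,294,967,296 TB -> 6,135,668 g -> 6.1 metric tons
```

The numbers check out: about six tons, so comfortably within a heavy-lift helicopter's payload but well beyond a car's.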
Ah, I just did a quick search for "how much data on the internet". But it looks like it was written in about 2010, which... sounds right. It's commonly held that the amount of data is growing exponentially. Which is kinda nuts.