We decapsulated the A5 a couple of days ago, but as you could see in those early pictures, you can’t tell much about a chip’s layout from the top metal – it’s all power and ground buses. So we have to de-layer the chip down to a level where we can see its block layout; not an easy task when there are nine layers of metal! In fact, these days it’s easier to go in from the back, remove the substrate silicon, and look at the gate level from below. Then we can identify the circuit blocks that make up the full device.
Just a quick post before I go to bed: the iPod Nano hasn’t been “jailbroken” as some sites claim – I do not have root access to the device, and I did not “install” an app. I figured out how to remove apps and insert a blank space into the springboard.
I have also figured out a way for the iPod to boot with modified files (e.g. the SpringBoard plist), bypassing the procedure that normally stops this; I hope this will allow us to figure out a way to jailbreak it. I am primarily focusing on exposing some of the (for now) hidden features of the device.
The hack is simple. It may lead to greater things. I just don’t want people getting their hopes up that it’s jailbroken just yet, or what I have done to be blown out of proportion.
I’ll write up more tomorrow. Any questions, contact me on twitter: @jwhelton
I’ve emailed Steve three times asking for an iPod Nano SDK! He’s got the emails. I know he’s got the emails. But he must have been too busy to get back to me!
Consequently I am very much looking forward to more revelations from Mr. Whelton. In the meantime here’s a video…
An international group of scientists is aiming to create a simulator that can replicate everything happening on Earth – from global weather patterns and the spread of diseases to international financial transactions and congestion on Milton Keynes’ roads.
Nicknamed the Living Earth Simulator (LES), the project aims to advance the scientific understanding of what is taking place on the planet, encapsulating the human actions that shape societies and the environmental forces that define the physical world.
Generating the computational power to deal with the amount of data needed to populate the LES represents a significant challenge, but it’s far from being a showstopper.
If you look at the data-processing capacity of Google, it’s clear that the LES won’t be held back by processing capacity, says Pete Warden, founder of the OpenHeatMap project and a specialist on data analysis.
While Google is somewhat secretive about the amount of data it can process, in May 2010 it was believed to use in the region of 39,000 servers to process an exabyte of data per month – that’s enough data to fill 2 billion CDs every month.
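As a sanity check on that CD figure, here is a quick back-of-envelope sketch. The per-disc capacity is an assumption on my part, since the article doesn’t specify one: the quoted “2 billion” implies roughly 500 MB per CD, while standard 700 MB data CDs would give closer to 1.4 billion.

```python
# Rough check of the "exabyte per month = 2 billion CDs" figure.
# Disc capacity is an assumption: 500 MB per CD matches the quoted number;
# standard 700 MB data CDs would give ~1.4 billion instead.
EXABYTE = 10**18              # bytes (decimal definition)
CD_BYTES = 500 * 10**6        # assumed bytes per CD

cds_per_month = EXABYTE / CD_BYTES
print(f"{cds_per_month / 1e9:.1f} billion CDs")  # 2.0 billion CDs
```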
A fascinating, and supremely ambitious project.
I would love to be involved on the imaging side of things.
At the risk of making light of the scope of something like this, I wonder if they’ve considered speaking to the “mice” that actually run the supercomputer that is Earth. Ultimately of course, thanks to Douglas Adams, we all know the final result of any simulation like this will yield the answer 42.
When [Andrew] Carol’s not working on improving the finer points of OS X as a software engineer at Apple, he’s hard at work building analog computers — like the Babbage difference engine — entirely out of Legos.
Recently, Carol completed his biggest challenge yet: a working Lego replica of the famous Antikythera Mechanism, created by the ancient Greeks around 100 B.C. as a way of predicting astronomical events like eclipses.
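Devices like this work by composing gear ratios: each meshed pair of wheels multiplies the train’s overall ratio by driven teeth over driver teeth. A minimal sketch, with illustrative tooth counts of my own choosing (not the counts in Carol’s Lego model), hitting the Metonic cycle’s 235 synodic months per 19 years:

```python
from fractions import Fraction

def train_ratio(stages):
    """Overall ratio of a compound gear train.

    stages: list of (driver_teeth, driven_teeth) pairs; each stage
    multiplies the output by driven/driver.
    """
    ratio = Fraction(1)
    for driver, driven in stages:
        ratio *= Fraction(driven, driver)
    return ratio

# Target: the Metonic cycle, 235 synodic months per 19 years.
# Hypothetical tooth counts: a 19-tooth wheel driving a 47-tooth wheel,
# then a 10-tooth wheel driving a 50-tooth wheel.
metonic = train_ratio([(19, 47), (10, 50)])
print(metonic)  # 235/19
```

Because `Fraction` keeps the arithmetic exact, you can verify that a proposed set of wheels hits an astronomical ratio precisely rather than approximately.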
If there were a Mt. Rushmore of computer gaming, John Carmack’s head would not only be on it, it would have the highest polygon count.
An interview of particular interest to anyone working on – or thinking about designing – more exotic and ambitious titles for iOS.
Carmack also ran through id’s decision process on pricing…
Every release that we’ve done on here has been an experiment with price point, and with different strategies. So far, we’ve had the most commercial success with Doom: Resurrection, which launched at $9.95. But we don’t have enough data points to really draw conclusions from this. We had great success with Wolfenstein Classic and Doom Classic, but they’re sort of riding the nostalgia buzz. So they can’t necessarily be evaluated in isolation.
With Rage, we intentionally went with a much lower initial price point, because to some degree this is marketing and promotion for the big title.
In that sense, Rage feels too much like a piece of marketing for the Rage franchise as a whole, and I am disappointed that I didn’t get the true FPS experience in a mobile Rage sandbox that I was expecting.
Oliver Kreylos has done what many thought was impossible – using multiple Kinects to capture different angles of an object simultaneously. Interference between the different IR dot grids is a problem, but a much smaller one than expected.
Holographic idol Hatsune Miku is the creation of Crypton Future Media, using Vocaloid software, and the company has put the avatar on tour with a live band. The sight of thousands of screaming fans waving glow sticks while the hologram “performs” on stage is straight out of a science fiction novel.
Thanks to qandrew for the heads-up. Amazing video. I’d love to go to one of these concerts.
Apparently, part of the Japanese bid for the 2022 (Football) World Cup was an offer to project entire live football matches into stadiums around the world. Not my cup of cha, but I can imagine that if you like football and can’t travel to the actual match, this might get you your fix of atmosphere while seeing almost the same thing as those who made the journey.
One day perhaps we’ll be able to get table versions of these projectors so that we can have Formula One races projected onto our coffee tables like virtual Scalextric tracks!
[T]he killer demo is the telepresence. Obviously, there are safety issues involved with exposing subjects to lasers, so the imaging was done with regular cameras—16 of them, all using FireWire to provide something close to real-time performance. These were sent across Ethernet to the display computer; the authors indicate that even 100Mbps has plenty of capacity to spare. At the receiving end, a desktop-class computer reconstructed these into a 3D image, and used that to control the laser that encoded the image into the display medium.
We’re not quite ready to see Princess Leia emotionally plead for help, given that the refresh rates are still a couple of seconds between frames. But the work at least demonstrates that the general approach is flexible enough to handle both long-lived displays and relatively rapid refresh, so it’s possible that a bit of further tweaking would improve its performance for one or the other of these, or get it to do something entirely new.
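The claim that 100Mbps has “plenty of capacity to spare” checks out at the refresh rates described above. A back-of-envelope estimate – camera resolution, colour depth, and exact frame rate are my assumptions, since the article doesn’t give them:

```python
# Back-of-envelope bandwidth estimate for the 16-camera telepresence rig.
# Resolution, colour depth, and frame rate are assumptions, not figures
# from the paper.
CAMERAS = 16
WIDTH, HEIGHT = 640, 480      # assume VGA FireWire cameras
BYTES_PER_PIXEL = 3           # assume uncompressed 24-bit colour
FPS = 0.5                     # "a couple of seconds between frames"

bits_per_second = CAMERAS * WIDTH * HEIGHT * BYTES_PER_PIXEL * 8 * FPS
print(f"{bits_per_second / 1e6:.1f} Mbps")  # ~59 Mbps, under a 100 Mbps link
```

Even fully uncompressed, the slow refresh keeps the aggregate stream comfortably inside 100Mbps; any compression at all would leave far more headroom.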
Cool technology. Looking forward to more news on this in the future.
I still remember when I thought my Newton was the most powerful thing I would ever hold in my hands. Or ever need, for that matter. Admittedly, I was in the first flush of unboxing it on launch day at the time.
It’s crazy to think that in a few years’ time we’ll probably be going similarly gaga over PS3 and current-gen iPhone games running via an emulator on our iPod Nanos.