Tagged: tech

on Apple in the post-device world

Apple has had their Icarus moment and it’s not losing Steve Jobs—though that may also prove problematic. Their Icarus moment is the inability to actually deliver on the promises of iCloud. They are now, and have always been, a device company and they are about to enter the post-device world. Try as they might, they can’t seem to execute on a strategy that puts the device second to anything else.

Let’s step back for a minute and think about where technology is heading in the next 5-10 years. It hasn’t even been 5 years since the iPhone came out, effectively launched the smartphone, and started us down the path to the post-PC world. We’re pretty much there at this point, but the story doesn’t end with post-PC.

The next logical step is the post-device world, where the actual device you use to access your data and apps is mostly irrelevant. It’s just a screen and an input mechanism. Sure, things will have to be customized to fit screens of different sizes and input mechanisms will vary, but basically all devices will be thin clients. They’ll reach out to touch (and maybe cache) your data in the cloud, and any heavy computational lifting will be done somewhere else (as is already done with voice-to-text today).

The device you use will more or less not matter. As long as it has a halfway-decent display, a not-shit keyboard, some cheap flash storage to cache some data, the barest minimum of a CPU and a wireless NIC, you’re good.
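To make the thin-client model concrete, here’s a minimal sketch of the read-through caching it implies, in Python. The cloud endpoint is made up and stands in for whatever sync service the vendor runs; none of this is any real API.

    import hashlib
    import json
    import os
    import urllib.request

    CLOUD = "https://cloud.example.com"          # hypothetical sync service
    CACHE = os.path.expanduser("~/.thincache")   # the device's cheap flash

    def fetch(path):
        """Read-through cache: serve from local flash if we have it,
        otherwise pull from the cloud and keep a copy for next time."""
        key = hashlib.sha256(path.encode()).hexdigest()
        local = os.path.join(CACHE, key)
        if os.path.exists(local):                # cache hit: no network at all
            with open(local) as f:
                return json.load(f)
        with urllib.request.urlopen(CLOUD + path) as resp:   # cache miss
            data = json.load(resp)
        os.makedirs(CACHE, exist_ok=True)
        with open(local, "w") as f:              # stash it on the flash
            json.dump(data, f)
        return data

Those few lines are the easy part; the hard part is everything they ignore: invalidation, conflict resolution, and syncing writes back up. That’s exactly the part Apple keeps fumbling.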

This world is not Apple’s forte. Not only does nearly all of their profit come from exactly the devices that will not matter, but they’re not very good at seamless syncing between devices either. It took them until iOS 5 to provide native syncing of contacts, calendars and the like directly to and from the cloud, after Android, Palm’s webOS and even comically-late-to-the-party Windows Phone had all implemented it.

Moreover, this is not the first time Apple has tried to provide some kind of cloud service. They started with iTools in 2000, then .Mac in 2002, MobileMe in 2008, iWork.com in 2009 and now they’re on iCloud. None of the previous incarnations have been what anyone would call a resounding success. In at least one case, it was bad enough that Steve Jobs asked “So why the fuck doesn’t it do that?”

So, who will succeed in this post-device world? The obvious answer might be Google, since they’re already more or less there with all of their apps being browser-based, but I’m not totally convinced. They seem to be struggling to provide uniform interfaces to their apps across devices, and that seems key here. For instance, Gmail’s iconography in my browser is different from what it is on my Android tablet, and that’s on a platform they own.

Actually, in a perverse way, I think Microsoft might really have what it takes to succeed in this world if they can execute. They have a long history of maintaining similar interfaces and design languages across different platforms and devices. Their failure to provide a clean Metro-based interface in Windows 8, though, puts a bit of a damper on their chances.

self-healing polyurethane

Following on the self-healing rubber from a few months ago, new work (by different people) has produced self-healing polyurethane coatings.

The secret of the material lies in using molecules made from chitosan, which is derived from the shells of crabs and other crustaceans.

In the event of a scratch, ultraviolet light drives a chemical reaction that patches the damage.

The work by University of Southern Mississippi researchers is reported in the journal Science.

They designed molecules joining ring-shaped molecules called oxetane with chitosan.

The custom-made molecules were added to a standard mix of polyurethane, a popular varnishing material that is also used in products ranging from soft furnishings to swimsuits.

Scratches or damage to the polyurethane coat split the oxetane rings, revealing loose ends that are highly likely to chemically react.

In the ultraviolet light provided by the sun, the chitosan molecules split in two, joining to the oxetane’s reactive ends.

Cool stuff, though I wonder how many normal bathing suits you’d have to replace before a self-healing one pays for itself.

on artificial eyes

http://news.bbc.co.uk/2/hi/health/7919645.stm

The BBC posted this article describing a 73-year-old man who has been blind for nearly 30 years and is getting his sight back—or something that vaguely resembles sight and is a whole hell of a lot better than not seeing.

He says he can now follow white lines on the road, and even sort socks, using the bionic eye, known as Argus II.

That’s actually pretty impressive, given that most of the previous work I’ve read about really only involves people seeing colors and lights. There are two cool videos: one’s an interview with the actual guy using the eye, and the other explains the basics of how it works, though at a level you probably could have figured out on your own.

It’s not quite to the point where I can have a chip in my optic nerve that gives me the augmented reality I’ve wanted since I was about 10, but it’s still damn cool and a really impressive first step. Maybe I’ll get my wish in my lifetime.

on speech recognition

http://www.sciam.com/podcast/episode.cfm?id=thinking-of-human-as-machine-09-02-24

This is the first idea about speech recognition I’ve heard in a long time that sounds right. The idea is to try to understand how the human brain picks up speech and decodes it, and to use that as a guide for how we might make computers do the same thing.

While this short snippet is light on details, it mentions the idea that different neurons respond to different frequencies. I have no idea how state-of-the-art speech recognition is done these days, but I bet there’s a lot we can learn from seeing how the brain does it. The premise the researcher in the above link is working from is that speech decoding is a more mechanical process in the brain than we think, and that maybe we can leverage that.
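As a rough illustration of what frequency-selective neurons might map to computationally, here’s a toy Python filterbank that splits a signal into bands and measures the energy in each. The band edges are arbitrary, and this is only an analogy, not how the brain (or any production recognizer) actually works.

    import numpy as np

    def band_energies(signal, sample_rate, bands):
        """Measure the energy in each frequency band, loosely analogous to
        neurons that each respond to a narrow range of frequencies."""
        spectrum = np.abs(np.fft.rfft(signal)) ** 2        # power spectrum
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
        return [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]

    # A pure 440 Hz tone should light up only the band containing 440 Hz.
    rate = 16000
    t = np.arange(rate) / rate
    tone = np.sin(2 * np.pi * 440 * t)
    print(band_energies(tone, rate, [(0, 300), (300, 600), (600, 1200), (1200, 4000)]))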

Kind of cool. Makes me wonder if we might eventually get this stuff to work after all.

neurons on silicon

http://news.bbc.co.uk/2/hi/uk_news/scotland/edinburgh_and_east/7867724.stm

Edinburgh University has developed a technique, which allows neurons to grow in fine, detailed patterns on the surface of tiny computer chips.

Cool stuff. I mean, it’s really just the very beginning, but the idea that we can control how human tissue grows over the same materials we use to create our electronics is a huge step forward if we eventually want to use electronics to help repair, replace and augment our own bodies.

I’ve been saying for years that I want a chip in my optic nerve that gives me a heads-up display without any of the focusing and perspective issues. It’s not there yet, but everything happens in baby steps.

on my bluetooth headset

I’ve long opposed bluetooth headsets for a bunch of reasons, including that the sound quality sucks and that they make you look like an idiot. I still stand by both of those, but Seattle just passed a cell phone law requiring that a hands-free device be used to talk while driving, so I got a headset.

The fact that it’s the conversation, not your occupied hand, that distracts you from driving (thus making this law silly) is a topic for another post.

Overall I’ve been pleased with the headset, and it provides a bunch of benefits I wouldn’t have expected. I can keep both hands in my pockets, and therefore warm, while talking to people as I walk outside. The volume can be turned up significantly louder than my phone’s ever could, which makes talking in airports and on busy streets possible, though still not fun. And at work, I can leave the phone wherever it gets the best reception (on top of the spare desktop at the back of the desk by the window) while staying free to move around at my desk.

That being said, the biggest annoyance, and one I didn’t anticipate, is that I’m constantly trying to answer the phone with the actual phone, only to find that the audio has been rerouted through the headset. Then I need to rummage around to find the headset while the person on the other end wonders what the hell is going on.

I don’t think there’s any solution for me other than trying to remember whether the bluetooth headset is on and enabled, but I was thinking about how it could be made better. My first thought was that if I answer a call using the handset rather than the headset, the phone should disable the headset for that call. That works beautifully for incoming calls, but since you can’t initiate outgoing calls from my headset, every outgoing call starts on the handset and the rule breaks down.

My second thought, which I think has more legs, is that the phone should figure out whether or not you’re holding the handset to your ear. If you are, you obviously want to use the handset, not the headset. This stupid-simple solution seems like it would work every time, and I’m 99% sure the iPhone already has the hardware for it, because I think it uses a proximity sensor to turn off the backlight (and lock the touchscreen?) whenever the phone is held to your ear.

You could extend this so that the headset also knew whether it was on your ear, letting the phone send the audio there when that’s where it belongs, but that extra step seems like it shouldn’t really be necessary.
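If I were sketching that routing rule in code, it’d be something like the Python below. The function and sensor hooks are all made up for illustration; no phone OS exposes exactly this.

    def pick_audio_route(handset_at_ear, headset_connected, headset_on_ear=None):
        """Decide where call audio should go.

        handset_at_ear: reading from the phone's proximity sensor (the same
            one that already blanks the backlight during calls).
        headset_on_ear: optional; only meaningful if the headset can sense
            that it's being worn (the extension above).
        """
        # Holding the phone to your ear wins unconditionally: if the
        # proximity sensor fires, you clearly want the handset.
        if handset_at_ear:
            return "handset"
        # Otherwise prefer the headset, unless it can positively report
        # that it's not on an ear.
        if headset_connected and headset_on_ear is not False:
            return "headset"
        return "handset"

Answering with the phone at your ear would route to the handset no matter what the headset thinks it’s doing, which is exactly the behavior I keep wishing for.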