Perhaps another alternative to Google Video upload, or is this more like the photo hosting sites? I’d like to find a way to get my personal media data closer to the internet backbone, so it’s not strangled by the slow pipe into the house, but I also don’t want all of it publicly indexed and accessible. The photo hosting sites are mostly either affiliated with a photofinisher and looking for print and merchandise revenue, or selling space to people who just need image storage and bandwidth.
From John Battelle’s Searchblog
Mike Homer, of Netscape and now Kontiki, and Marc Andreessen, of Netscape and now Opsware, have launched the Open Media Network, a free platform for the storage and distribution of public video and audio content. I spoke to Homer about the new network, which uses Kontiki’s video serving system on the back end. The system is a mashup of sorts between Tivo and BitTorrent – it has a well-considered interface and employs a secure P2P network for file distribution (it doesn’t actually use Tivo or BitTorrent technology).
More fun with Google Maps and TiVo.
I haven’t looked much lately at developing directly on TiVo hardware vs trying ideas out on MythTV-style PVR platforms while experimenting with video, media servers, and TV-centric information appliances. A while back it looked like you could hack things into the TiVo, but the main advantage of the TiVo was that it worked out of the box and was relatively cheap compared to building your own, i.e. it was an actual product, not a development platform. Building a system from parts (i.e. MythTV on Linux or a Windows Media PC) isn’t an end-user-friendly activity, but it can give you Unix-like flexibility where everything is possible, provided you’re willing to do it all yourself.
Saw an article about TiVo’s latest plug for developing applications on the TiVo; I should go take another look.
TiVo’s HME Developer Challenge is part of that effort. Consumers with broadband-connected TiVo recorders are still a small number–about 300,000–and so far only about 60 applications are available on the Internet. The deadline for contest submissions is May 1, and the winners will be announced at the JavaOne conference in late June.
“There’s a lot of interest around hacking TiVo boxes…this was a way to help people see TiVo as a platform,” said Arthur van Hoff, former principal engineer at TiVo responsible for the HME project. Van Hoff has even created a program allowing him to control his home-lighting system from his TiVo.
Oops. That’s why you’re supposed to scrub the hard drive before getting rid of the computer…
The United Kingdom’s Ministry of Defense is facing major embarrassment, and the threat of having lost classified military data, based on a man’s claim that he found a large number of sensitive files created by the government on a laptop handed to him at a garbage dump.
I’m still wrestling with what to do about my next notebook computer. I’m probably going to end up with something like an IBM T42 or an HP nc6220, but I keep toying with the idea of changing my entire work setup to be more mobile and perhaps returning back to a full desktop system or something like that.
In the meantime, someone’s found some photos and documents on the unreleased IBM X41 Thinkpad on the FCC web site. (via Slashdot)
You’ve got to love the FCC when you’re craving information for rumored devices and these pics found on the FCC site depict an IBM Thinkpad X41 Tablet PC. No real surprises in these pics, the depicted Tablet looks just like a Thinkpad. Looks like it has Bluetooth and a dual antenna WiFi. Except, of course, for the swivel screen.
updated 04-26-2005: X41 discussion at Thinkpads.com
A couple of days ago, I posted about the Nikon RAW image format and the general issue of access to “digital negatives”. Interest in this topic has been building for a while, and this being the age of instant communities of interest, we now have the OpenRAW group:
OpenRAW is a group of photographers and other interested people advocating the open documentation of digital camera RAW files.
After Canon dropped support for their Canon D30 DSLR in their latest software release and Nikon removed features of their own RAW converter Nikon Capture, plus the encryption of features in Nikon’s D2x digital camera RAW format (NEF), some members of the mailing list D1scussion founded the OpenRAW mailing list to coordinate their efforts to motivate camera makers to openly document their individual RAW formats.
This web site is the first result from this discussion and has the goal to gain public awareness of the RAW Problem.
There is also some followup and the official response from Nikon at DPReview.com:
Continue reading OpenRAW.org, more on Nikon RAW format
The Big Sur Marathon has a well-deserved reputation for being difficult, scenic, and well run. This is my second time at Big Sur, having run it last year (2004) as well. It’s my 3rd marathon overall, after starting as a novice runner in 2002. I’m continuing to build an aerobic base and improving my running mechanics, so each time out on the course is another experiment and learning experience.
During the past year I’ve maintained a base mileage of 45-55 miles per week, with no major injuries. I’ve regularly logged 13-16 mile runs during the past year, but have only gone up to 18 miles on this training cycle, vs the previous year where I put in four 20 mile runs and weekly hill intervals. From my training log paces and HR data I can see that I’m in better base condition than the previous year, but going in I’m uncertain about how things will hold up after 3 hours on the road.
Continue reading Big Sur Marathon 2005
Link to News.com article here
I was just trying out the performance of different video encodings over WLAN yesterday, and had been thinking about scenarios where something like this might make sense, given that people are starting to carry around more powerful client devices.
Singapore Airlines already runs a great video- and audio-on-demand service on their flights, but it requires essentially a full PC under every seat. Moving to a bring-your-own-client entertainment format might not make sense for SQ, but would be a huge improvement for me on United or any other US-flag airline.
Unfortunately, it’s pretty easy to clog up the shared bandwidth with high-quality video, and you can’t really solve an “on-demand” bandwidth problem using multicast. I’m not even sure they could count on 802.11g or 802.11a for the higher bandwidth: there aren’t that many 802.11a clients around, and a mixed 802.11b/g network won’t deliver full g performance unless you keep the 802.11b clients out. Going the other way, I don’t think people would be terribly happy with 150-300 kbps “broadband internet” quality video streams on an airline flight, but I could be wrong.
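As a rough sanity check on the bandwidth problem, here’s a back-of-the-envelope sketch. The effective-throughput numbers are my own assumptions about real-world WLAN behavior, not measured figures:

```python
# Back-of-the-envelope estimate of how many simultaneous unicast video
# streams a shared WLAN could carry. Effective throughput figures are
# rough real-world assumptions, well below the nominal link rates.

EFFECTIVE_THROUGHPUT_KBPS = {
    "802.11b": 5_000,               # ~5 Mbps usable from an 11 Mbps link rate
    "802.11g (g-only)": 20_000,     # ~20 Mbps usable with no b clients joined
    "802.11g (mixed b/g)": 8_000,   # protection overhead drags g way down
}

VIDEO_BITRATES_KBPS = {
    "broadband-quality": 300,       # the 150-300 kbps tier mentioned above
    "near-DVD": 2_000,              # what people might actually expect
}

for net, capacity in EFFECTIVE_THROUGHPUT_KBPS.items():
    for quality, rate in VIDEO_BITRATES_KBPS.items():
        streams = capacity // rate
        print(f"{net}: ~{streams} {quality} streams at {rate} kbps each")
```

Even under these generous assumptions, a g-only network carries only a handful of near-DVD streams, which is why on-demand video for a full cabin looks hard.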
Actually, I’d be pretty happy if United would just get power deployed to all the seats so I could run my own gear without needing a ton of batteries to get across the Pacific.
An airline entertainment company is working with Microsoft to provide videos to passengers’ laptops, according to a blog posted by a Microsoft employee.
In the blog, the Microsoft employee said that the unnamed airline entertainment company had recently spent two weeks testing whether it could set up a WiFi wireless network “inside the fuselage of a commercial aircraft while airborne.”
I’d like to read more about what they actually tried. The news.com article doesn’t link to the source, and it didn’t turn up in a quick search either.
I rarely sit down and just listen to my music collection these days. Most of the time, any music I hear is on the radio, computer or CD player while driving, or working, or generally doing something else. My largest weekly block of music listening time is using an MP3 player during treadmill workouts.
So, it was interesting yesterday evening when I started noticing how bad MP3-encoded tracks sound compared with the original CDs.
I’m old enough to have actually purchased physical media (first vinyl, then CD) for nearly all the music I presently own. However, I have rarely gone back and played the actual CDs I’ve purchased for several years, as the first step after removing the wrapper is to encode them and put the bits on the file server. When my daughter was a little younger, duplicate CDs were being replaced weekly after being stepped on, spilled on, turned into art projects, and other mishaps. Other sets have been left behind on airplanes, rental cars, etc. Having everything on the server has allowed us to enjoy the music without worrying about physically destroying or losing the original.
Back when I started doing this several years ago, I remember trying a comparison of original CDs vs 128kbps MP3 and deciding that the encoded recordings would be good enough for general use, more or less replacing cassette tape. 192kbps and 256kbps encoding seemed extravagant — disk storage was much more expensive then, I was planning to encode all of my CDs, and the modest degradation in quality didn’t seem too bad.
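For a sense of the tradeoff, here’s the rough storage math. The collection size and minutes per CD are illustrative guesses on my part, not an inventory:

```python
# Rough storage math for encoding a CD collection at various MP3 bitrates.
# Collection size and minutes-per-CD are illustrative assumptions.

def collection_size_gb(num_cds, minutes_per_cd, bitrate_kbps):
    """Total size in GB (1 GB = 10**9 bytes) for the encoded collection."""
    seconds = num_cds * minutes_per_cd * 60
    total_bytes = seconds * bitrate_kbps * 1000 / 8
    return total_bytes / 1e9

for kbps in (128, 192, 256):
    gb = collection_size_gb(num_cds=300, minutes_per_cd=60, bitrate_kbps=kbps)
    print(f"{kbps} kbps: ~{gb:.0f} GB for 300 CDs")
```

Doubling the bitrate doubles the storage, so going from 128 to 256 kbps takes a hypothetical 300-CD collection from roughly 17 GB to roughly 35 GB, which mattered a lot more when drives were small and expensive.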
I may have to revisit this, now that hard drive capacity has gone up, costs have gone down, and for some reason the “audio haze” added by the MP3 encoding has suddenly become more noticeable to me.
My old turntable and albums have been in storage for years, since our daughter was born. It might be time to set them up and give her a demonstration of what vintage vinyl sounds like. I think I might still have the old Mobile Fidelity pressing of Pink Floyd’s Dark Side of the Moon somewhere…
Someone got around to picking apart the raw image format for Nikon cameras. From an article today on News.com:
A Massachusetts programmer says he has broken a proprietary encryption code that has effectively forced some Nikon digital camera owners to use the company’s own software.
Because Nikon scrambled a portion of the file, legal worries have kept third-party developers like Adobe Systems from supporting Nikon’s uncompressed “raw” photos in their software. Nikon sells its Nikon Capture utility for $100.
Nikon’s white-balance encryption had hindered photographers who preferred other, sometimes faster or more capable, image conversion software by making it infeasible to convert large numbers of images. Canon–which bundles its raw conversion software with its cameras and does not charge extra–does not encrypt its photo metadata.
As a photographer, I absolutely want access to the full image data — effectively the “negative” for the digital image — especially if I’m going to purchase a high end camera. I generally use Canon equipment already, for a variety of reasons, including raw image support. I’m not sure why Nikon wanted to secure the white point data. It doesn’t actually make much sense for them to prevent their customers from using it; even if Nikon thinks their color processing path is better than any 3rd party software, charging their customers for what is arguably a basic function of the camera is silly. Maybe there’s some other reason they think it makes sense.
The legal ambiguity may leave Nikon owners without full support from Adobe for a while, but at least there will be some tool support available:
In an e-mail message late Thursday, Bibble Labs founder Eric Hyman said he had also broken the Nikon white balance code and had incorporated it in the latest version of his commercial image-manipulation software. Bibble Labs sells the full-featured version of its “Bibble 4” software for $129, and a less-capable version for $69.
update 2005-05-10 15:42 roundup of followups at Metafilter
An article in the Red Herring on business opportunities in poor/developing/rural economies with some specific examples of “locally appropriate” products and services. C.K. Prahalad is heavily quoted, as usual.
“There are five billion people in developing countries that are currently underserved, but can’t wait to join the global economy,” says Coimbatore Krishnarao Prahalad, a University of Michigan professor and author of best-selling business books, including The Fortune at the Bottom of the Pyramid: Eradicating Poverty Through Profits.
Consumers “at the bottom of the pyramid”—as Mr. Prahalad refers to the poor—can’t afford the same products as Western consumers. On average, they earn less than $2 per day. Mr. Prahalad, considered one of the world’s most influential business thinkers, believes companies can make a profit targeting this market, if they make their advanced technology affordable.
Continue reading On Business Innovation for the “Bottom of the Pyramid”
The iceberg B15A is 71 miles long and has been adrift off Antarctica for a while. It finally bumped into the 43-mile-long Drygalski “tongue” of ice extending offshore from the “David” glacier yesterday. Satellite photo here.
Link via Metafilter
An image snapped by the European Space Agency’s Envisat satellite on 15 April shows a 5-km-long section of the ice tongue breaking off at its seaward end as the bottle-shaped iceberg brushes past.
B15-A is the largest remaining section of the large B15 iceberg which broke away from the Ross Ice Shelf in 2000. Scientists have placed a Global Positioning System device on it to track its movements.
Answer 10 questions and see if you’re an Isolationist, Liberal, Realist, or Neoconservative. (via Metafilter)
According to my responses, it thinks I’m a Realist:
Not a perfect fit, but closer than the other categories. “Isolationist” seems to be the opposite of the work I’ve been doing for a while. I sometimes get labelled “Conservative” or “Liberal” depending on whether I’m here in the Bay Area or visiting old friends back in Maine. There isn’t a category for “Idealistic Pragmatist,” so I guess that leaves me as a “Realist.”
This makes sense. Although Adobe and Macromedia have competed on the content creation front over the past years, starting out from the print world in Adobe’s case and the CD-ROM world in Macromedia’s case, this should allow the combined organization to focus on making the existing tools play better, and move on to the broader problem of document and information management.
Life will be just fine for the existing customer base of print and interactive developers, who will probably end up with a toolbox of Photoshop, Illustrator, Dreamweaver, Flash, InDesign, and Acrobat, each of which are great, even dominant, in their categories, have loyal user communities, and will become more useful as they become better integrated.
The more interesting question is the one the merger is predicated on, which is how to address the broader space of document workflow and information management.
Adobe has been training everyone to think of PDF as “electronic paper,” in that it behaves like a printed document with a well-defined, mostly fixed presentation of text and graphics. This is mostly how it is used today, literally replacing paper documents in print-on-demand applications such as product literature, distribution of paper forms for health, government, and corporate applications, or formatted output of books and publications.
Macromedia Flash, on the other hand, is geared toward dynamic graphic presentation: almost nothing is static, and it can (and typically does) retrieve new underlying content to be presented through the Flash software client. Complex multimedia presentations are routinely implemented in Flash, including entire web sites. Flash is more a programmable content presentation system than anything else.
PDF excels at migrating a paper-based workflow model to an online environment, because it turns paper documents into something digital that can be moved around electronically. While paper isn’t going away for a long, long, time, if ever, the problem in the corporate / enterprise space is that we may be moving to an environment where we aren’t starting out with static data very often, and the document is coming from an array of content sources. This is an area where Flash might do well, if other approaches such as Ajax don’t solve enough of the problem.
Look at this blog or any news site as a very simple example. None of the articles posted are actually fixed documents; they’re all facets of an underlying database of content. Now think about the information handled in various corporate workflows. A lot of it already lives in an assortment of databases, some of them actual “databases,” but much of it in document files scattered across the hard drives of the company. A lot of the documents floating around are really snapshots of a particular view of the underlying data at a point in time. Taken further, we get XML-based interactive documents such as Google Maps, or similar applications such as these demos at Laszlo Systems. It’s not a stretch to imagine existing blogging software wrapped around databases and data sources within an enterprise, publishing RSS feeds which are automatically aggregated into “workflow” documents; this is already starting to appear in bits and pieces.
Hardcopy output has a huge advantage in being persistent technology — we can be reasonably sure that the paper document can be read in 50 years, while the same cannot be said about the PDF document on CD-ROM, DVD, or any other current storage media. But it also seems that a direction for “documents” will be towards presenting faceted views of the data content available to the publisher. PDF has the “paper” part covered. Flash is useless for “paper” but has a great installed base of dynamic presentation clients.
I would find it disconcerting if the “paper” PDF documents started updating themselves with much more than customer contact data or similar, and I don’t think I’d trust a Flash web site to give me the same content from one week to another. That might be just an age thing, since I’m used to “paper” behaving a particular way, which might change in the future.
Adobe has needed to do something on the enterprise side for years. After bringing in Macromedia, they’ll still need to find a way to address the content / information management side, but Flash seems like a better fit to interactive documents than retrofitting dynamic presentations into PDF.
Next, they need to link up with some content management / database / XML solutions that are both human-friendly and auditor-compliant.
comments at kottke.org
discussion at slashdot, followup discussion at slashdot
This turned up in e-mail this evening, more reasons to like Flickr…
You may have heard on the grapevine that we planned to
reward our dear Flickr members who bought a Pro Account in
the early days. Well, it’s true! And since you’re one of
those lovely people, here’s a little something to say YOU
1. Double what you paid for!
Your original 1 year pro account has been doubled to
2 years, and your new expiry date is Apr 8, 2007.
2. More capacity!
Now you can upload 2 GB per month.
3. 2 free Pro Accounts to give away to your friends!
This won’t be activated for a day or two, but when it
is, you’ll see a note on your home page telling you
what to do.
Thank you so much for putting your money where your mouth
is and supporting us, even while we’re in beta. Your
generosity and cold, hard cash helped us get where we are
Yogi Berra once said that “Baseball is 90 percent mental. The other half is physical”, which also applies well to marathons. It’s taper time, and we’re clearly working on the mental part of the game.
With seasonal allergies in full swing, I wake up feeling tired and groggy every morning, so it’s easy to be anxious about whether I’m in shape to run Big Sur, especially without the reassurance of completing a decent workout every day. This week I’ve been mostly doing short, easy 3-5 mile runs, vs the daily 9-13 mile runs I was getting in a few weeks ago, since part of the reason for tapering is to let the body recharge a bit. Unfortunately, while that’s good for the body, it’s tricky for the mental aspect of preparation. In my case, I know it can take up to an hour of easy running before I get comfortable, so between allergies and short runs, most of my workouts this week have left me feeling slightly uneasy. Today’s workout was a full hour, and felt awkward for the first 30 minutes but the last 3-4 miles clicked off cleanly at MP-to-LT paces, which makes me feel better for now.
Since I’m running less this week, I’m spending more time on planning. Today I’ve been debating which shoes to run in. I’m down to my last pair of Saucony Hurricane 5s, which I have been alternating with the Asics GT-2100 for the past few months. The Hurricanes are a little heavier and provide more foot protection, and I’ve used them for thousands of miles of training, making them a safe choice. Unfortunately, Saucony stopped making the 5s over a year ago; I didn’t like the Hurricane 6, and haven’t had a chance to try the Hurricane 7s yet. My current Hurricane 5s are the last of the ones I stashed away when they were discontinued, and they’re at something like 300 miles, which is where I typically stop using them for long runs. Road Runner Sports occasionally has some, but not in my size lately.
The GT-2100s have been working ok so far. They’re lighter than the Hurricanes, but aren’t “lightweight” like the New Balance NB900s I got a while ago for shorter/faster workouts. I didn’t like the Kayano IX or the GT-2090 when I tried them a couple of years ago, but the GT-2100s haven’t given me any discernible problems. My stride mechanics are substantially better, and I weigh less than when I started running, which makes the extra cushioning in the Hurricanes less critical now than when I originally picked them. However, I can definitely tell that the GT-2100s are less “cushy” after 13+ miles than the Hurricanes; I’m not sure how they’ll feel after 26+. They feel a lot worse if my form gets sloppy, which is an incentive to pay attention.
I’m probably going with the GT-2100s.
Just over a week left before the Big Sur Marathon; I found this blog linked from the web site. Reading about these guys over at the Monterey Herald makes me feel a little better about how this training cycle has been going.
The Longest Mile is an online diary following the triumphs and travails of Ken Ottmar and Jon Segal, two overweight, out-of-shape, newspaper desk jockeys training for the brutal Big Sur Marathon. Come and taste the pain.
Just looking at these guys makes my feet and knees ache. Hope they do ok next weekend, or at least avoid major injuries…
During lunch yesterday, I spent a few minutes with Netstumbler testing a simple cantenna intended for low-cost rural community networks. I’ll write about the cantenna separately; it’s based on this design and provides around 8 dB of gain. Even more valuable in a cluttered RF environment (such as around here), the directionality of the antenna reduces the noise floor substantially: with the directional antenna, the noise floor was around -88 dBm, vs around -66 dBm with the built-in omni.
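The dB arithmetic here is worth spelling out; a quick sketch using the figures above (antenna gain and noise floors as I measured them):

```python
# dB arithmetic for the cantenna vs the built-in omni, using the
# measured figures above. Every 10 dB is a 10x power ratio, so the
# antenna gain and the noise-floor drop multiply in linear terms.

def db_to_ratio(db):
    """Convert a dB value to a linear power ratio."""
    return 10 ** (db / 10)

antenna_gain_db = 8             # cantenna gain over the omni
noise_floor_omni_dbm = -66      # built-in omni antenna
noise_floor_cantenna_dbm = -88  # directional cantenna

noise_drop_db = noise_floor_omni_dbm - noise_floor_cantenna_dbm  # 22 dB
snr_gain_db = antenna_gain_db + noise_drop_db                    # 30 dB

print(f"Noise floor drop: {noise_drop_db} dB "
      f"(~{db_to_ratio(noise_drop_db):.0f}x less noise power)")
print(f"SNR improvement toward the pointed direction: {snr_gain_db} dB "
      f"(~{db_to_ratio(snr_gain_db):.0f}x)")
```

Roughly 30 dB of combined signal-to-noise improvement in the pointed direction, about a factor of 1000 in power terms, which is why so many more access points showed up.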
An informal survey from my office (sitting in my chair with the antenna and revolving through 360 degrees a few times) turned up 24 access points, 11 of which were unsecured. I expected to pick up a few networks while pointing toward the window, but I was surprised at how many popped up while pointed through the opposite side of the building. It probably helps that I’m on the 2nd floor, but this was more than I expected. A similar experiment a couple of years ago turned up only 2 SSIDs, not including mine.
SSIDs picked up:
143Rinconada, 2WIRE517, 2WIRE626, 2WIRE778, Alma Zone,
Andrew's Network, bmillin, dolev, Home, Home,
hughes-wi-fi, hughes-wi-fi, linksys, linksys, Linksys,
linksys-g / Palo Alto, Linksys-PA, LR, NETGEAR, NETGEAR,
settlers, spyfox, TASAR-HOME, zephyr
Perhaps I should see if anyone’s interested in setting up a bandwidth co-op, since Palo Alto Fiber-to-the-Home seems to be stalled. It’s sometimes frustrating to see how slow and expensive internet service is here by comparison with Korea, among other places.
Lots of interesting Google topics recently. Yesterday, Google launched Video Upload, inviting uploaded video to be indexed on Google in the future.
This is fundamentally different from Google Search as it exists today, in that
- Content needs to be explicitly uploaded to Google vs being spidered automatically
- All indexed content has a claimed owner (need a Gmail account to upload)
- Licensing information is built into the search metadata at Google rather than at the source
It’s unclear to me whether Google becomes the primary content server or if only metadata is served to video search clients, leaving the actual content delivery to the owner. Although Google currently makes a cached copy of web content, today’s searches are normally directed to the source URL rather than being served from the Google cache. Turning the Google infrastructure into a global media server seems like a plausible direction to consider, though.
This also seems to lay the groundwork for something like an Apple iTunes Store for video and other media, with content from both individuals and commercial organizations.
More details on the Google Video Upload FAQ
What is Google Video?
Our mission is to organize the world’s information and make it universally accessible and useful. Currently, Google Video lets you search a growing archive of televised content — everything from sports events to dinosaur documentaries to news programs. In addition to televised content, we’re now accepting video from anyone who wants to upload content to us. Uploaded content will not be immediately available to users searching Google Video as this is just the submission stage of the program. But (if you’ll pardon the pun) stay tuned.
What is the Google Video upload program?
The upload program lets you submit videos electronically to Google Video, as long as you own the necessary rights (including copyrights, trademarks, rights of publicity, and any other relevant rights for your content). Just sign up for an account and use our upload tool to send your videos to Google. The program is still in beta so you won’t see your videos live on Google Video immediately.
To make sure that your video is submitted properly, please read below about preferred file formats and our approval process. Videos may not go live if they’re not approved or if we’re unable to accept the format.
Initial thoughts on contributing video:
- From a commercial point of view, searchable video would be a great benefit, provided you didn’t lose control of the content.
- From an individual artist’s point of view, this would be a huge win, since there are so few distribution mechanisms for short films and multimedia projects.
- From a casual user’s (family videos) point of view, I’m less sure. There are clearly people happy to have their entire life published to the world in perpetuity, but I suspect they’re the minority. For my own photos and videos I’d like to be able to search, but not to have the content accessible to the world at large. So a personal version of this might be useful.
Comments at Slashdot, BoingBoing
This is a cool search/navigation application for Flickr, complete with entertaining animation.