Archive for the ‘Tech-geekery’ Category

Curious Photoshop/Mac/WiFi behaviour

May 27th, 2011 by Reinder

One for the Mac-heads in the audience.

I've been trying to cut distractions while working on comics, which means that I turn off the WiFi on my Macbook now. However, this doesn't work as well as it should, because the following behaviour occurs:

  • If I switch off WiFi (AirPort) and then fire up Photoshop CS 4/Mac, Photoshop takes forever to start, and so does Adobe Bridge for that matter. In fact, both apps become unresponsive and have to be killed using Force Quit.
  • If I fire up Photoshop while WiFi is on and then turn off WiFi, I have no problems until I try to save a file. Then Photoshop becomes unresponsive for minutes on end, and I can either try to wait it out or switch on WiFi again, after which Photoshop becomes active again and finishes saving the file quickly. I don't think I've ever successfully waited it out since I started noticing the problem - I have stuff to do, after all.
  • If I reboot the Macbook while WiFi is off, I have no problems, everything works as it should, and I get a few hours of distraction-free Photoshopping.

Actions taken: I have checked my Photoshop preferences and turned off Version Cue, as that is no longer operational anyway and this would be an obvious cause of the problem if it tried to connect to the Version Cue server and failed to find it. This did not, however, make any immediate difference.

Later, I will look for causes of this phenomenon, and maybe write up a proper bug report and then figure out who to send it to. For now, though, I'm getting off the electronic boob again. I have stuff to do, after all.

Scribus sucks, but its suckage can be defeated with patience

November 16th, 2010 by Reinder

What is this?

Screenshot from Scribus

It's a screenshot of the open source DTP program Scribus on a Mac with the 143rd episode of The Corby Tribe open, as it is being prepared for presentation on my Drunk Duck mirror. The image is a Photoshop file with adjustment layers, and is not rendered correctly. However, it is at least rendered, so that with this image, I could in theory tweak the size until it fits.

I can't blame Scribus for not showing giant multilayered Photoshop images correctly. It's no big deal to flatten the image, turn it into a PNG and try again with a smaller, less complex version of the same file, right?

That brings me to this. Guess what this is:

Another Scribus screenshot, showing a seemingly empty image box

You might answer, "This is the Scribus file from before you placed the image into the box!" and you'd be wrong. This is in fact Scribus after I created a PNG version of the image, and placed it into the box. The preview settings are set to display the image (why anyone would not want to display images in a page layout program is beyond me, but then I'm a mere amateur at this). The PNG is not rendered at all.
"Oh well," you might say. "Didn't you say before that Scribus was a bit buggy and had different issues across different platforms, but that it was better than nothing at a price you can't beat?" And indeed I did! So maybe the bug-of-the-month for Mac users is that it can't render PNGs. But we know it can display Photoshop files, right? Even though it doesn't do layered Photoshop files well? So why don't we create a flat Photoshop file and use that?

Here's the result:

Scribus for mac with a flattened PSD file

Wait, did I just post the same image by mistake? Nope, I went to the trouble of making a whole new screenshot. Dunno why I bothered really.

So how about using a TIFF file? The file format that absolutely did not work two years ago when I tried to do the same work using Scribus for linux?

Hey, this works. So from number 143 on, I have to use TIFF as an intermediate file format

So that's one problem solved. From episode 143 on until I switch OS's again, I'll just use TIFF as an intermediate file format. But it's amazing how much Scribus sucks. I mentioned before that it was easier to format text in OpenOffice and import the formatted text into Scribus!Linux!2008 than to format the text in Scribus itself. Last time I used Scribus on a Mac, that strategy did not work at all and resulted in many character rendering errors. Since the text for this episode was actually prepared a year ago, I have not had to put that to the test again, and I dread doing so with the next episode, whenever I get around to creating it.
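If this workaround sticks around, the conversion step is at least easy to script rather than done by hand every episode. Here's a minimal sketch, assuming the Pillow imaging library is installed; the episodes/ folder name is just an example:

    # Batch-convert flattened PNG exports to TIFF so Scribus will render them.
    # Assumes Pillow is installed; the folder name is a placeholder.
    from pathlib import Path
    from PIL import Image

    for src in Path("episodes").glob("*.png"):
        img = Image.open(src).convert("RGB")   # flatten any alpha channel
        img.save(src.with_suffix(".tif"))      # Pillow picks the TIFF format from the extension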

Other open source alternatives to expensive commercial programs usually work well for basic functions, and the degree to which they work for advanced functionality depends on how much the development community cares about them. Scribus is unique in turning even basic features such as importing an image into a headache that returns every time you are on a different computer. And yet it has won the Packt Open Source award and is in fact used by professionals who do complex things with it. Unbelievable.

There’s no better welcome home and no better start to the new year than a dead computer

January 4th, 2010 by Reinder

As you may remember, the day before I left for Tennessee in mid-October, my Macbook died (it got better, especially after I gave up running Parallels on it).
I just came home from Tennessee after a trip that included delays, lies about delays, the worst airline information I have ever come across and 1 hour and 45 minutes of standing in line for the Lost Luggage service, because Iberia Airlines, the worst airline I've ever flown with, is part of a partnership that can't be bothered to assign more than one person to the lost luggage desk. On returning to my apartment, I found most of it OK, but within about an hour of my switching on the PC, it died suddenly, and it didn't take a lot of work to find out that once again, a hard drive had given up the ghost. Of course, all my data are backed up. Of course, all my back-ups are in a suitcase that's stuck in Madrid and that Iberia may or may not manage to retrieve. Except for the one that is in Aggie's house in Tennessee.

I really feel like I can't win. Multiple redundancy gets defeated by multiple points of failure failing at the same time. And you know what? I've had enough. I'm not replacing that drive. I'm retiring that box now and will be working exclusively on the resurrected Macbook until either that dies permanently too, or I have saved up enough money for a really good new desktop. No more rear-guard battles and hurried replacements for me. I have better things to do with my money than buy replacement parts that get blown up within the year. I didn't use that desktop for nearly three months; I can live without it.

I would, however, like to get my life's work back. Not having that at my fingertips in any form makes me very very nervous and twitchy.

Running Windows on a Mac still to be considered harmful

October 27th, 2009 by Reinder

Reader Kitchenbutterfly asks:

Why have you burst my bubble? I've been living in paradise, claiming the MAC and all things APPLE to be the next best thing to sliced bread, or at least windows! And I know about buying computers in a hurry.

Well here's the thing: I loved my first iBook. I never had any serious problems with it. But it was getting old, it was a G4 and there were certain things it couldn't do that would come in handy for my long-distance work. Like run Windows in some form or another. So for Christmas, Aggie, who is sweet and loving and obviously completely crazy, gave me a new MacBook. I immediately started messing around with both Boot Camp and Parallels Desktop to figure out which setup would work best for me (there is a third option, VMWare Fusion, which I haven't tried and right now don't have the heart to). The answer is now turning out to be that neither works well enough for real work, and both are harmful to the safety of my Mac hardware and my data.

The Boot Camp arrangement did not survive the first five weeks of long-distance work over the summer. The final week was spent doing whatever I could to get work done on one of Aggie's computers. Since one of them did not want to work with the Logoport online translation client, and the other did not want to let me install SDLX*), this took a lot of moving back and forth between computers. Then I took my bricked Macbook home to Groningen to see what I could do about it.

Meanwhile, some changes in our company's VPN software allowed that to work with a Parallels virtual machine, which it hadn't done before. Wonderful! I could run Windows in the VM, keep all my data safe on my Mac folders, and access my Mac software while working in Windows.

Well, I could, right until I upgraded to OS X 10.6 Snow Leopard. Parallels 4 is supposed to work with it, and the company even has a nifty new upgrade to make it work even better. It was while checking my Parallels VM in preparation for installing that nifty new upgrade, the day before leaving for the US, that my MacBook became seriously bricked again.

I took the bricked MacBook with me to Tennessee to see what I could do. Three thirty-mile drives to Murfreesboro later, we had a diagnosis of OS corruption, which the repair guy said actually happened quite often. They offered to wipe and reinstall for a mere $130; I said no, we'll do it ourselves, thanks, and we took the bricked box home, wiped it, reinstalled it, and restored it to its state of October 13, 2009 using the magic of Time Machine. Time Machine is excellent, but I'm finding myself using it a little too often.

I went to work using the Virtual machine and all was right with the world. I spent the Sunday before I was due to get back to work installing my software on the VM, and it was good. On Monday, I went to work, and all was good. What I didn't realise was that the reason all was good was that Parallels was unable to download its nifty new update over our slow internet connection (see previous post). But at the end of the first working day, it had somehow snagged all 110 or so MB of it and prompted me to install it. Foolishly, I did. The installation ended with an error (something about a required file missing - even though this was an automated download that should have got everything) and my VM no longer worked well. Using Time Machine, I tried to restore the software to its last version, which worked, but restoring the actual VM file (an 8 GB monstrosity) turned out to be harder. This is probably because the VM had been running whenever the Time Machine back-ups were made, so what ended up in the back-up was not a workable file to boot the VM from. After repeated attempts, running Parallels caused the Mac to hang again.

So now I'm restoring it again to the state it was in on October 13, 2009. After that, I will turn off all update functionality in Parallels, reinstall the software I need and hope for the best until the new PC arrives here (working on Aggie's machines has become problematic for other reasons that I don't want to go into as this post is already quite long and nerdy).

And that is the tale of my MacBook woes. Some of my woes are clearly the result of human error (upgrading anything that works is risky and with the Parallels upgrade, there was already a known risk factor), but I'm beginning to think that the main human error here is wanting to run Windows on a Mac in the first place. I get a lot of joy from using that machine (and I do mean actual pleasure in using it as opposed to merely finding that work goes smoothly and the computer isn't an active obstacle) whenever I use Mac software on it, whether commercial and actively developed for the Mac, or open source and ported to the Mac. I get nothing but grief and a great deal of learned helplessness from working with Windows on the same Mac. So the lesson here is that Macs should be used to run Mac software; score one for the Cult of Mac, I guess.

I'm stuck with Parallels for a few more days. When the new PC arrives, it will be gone, and good riddance.

*) Incidentally, if you love well designed software, translation software will open your life to forms of horror beyond the imagining of mortal men. If translation software can be said to be designed at all, it is designed based on the interests of anyone but translators. SDLX Suite, at least until its most recent version released this year, was not designed at all - it was a Frankensteinian patchwork of previously unrelated programs that the SDL company had bought over the years, that had no single interface vision and which only worked together through filters and a gigantic super-interface for project management and bundling. I have heard that the new release is better integrated, but its backward compatibility is nonexistent. This is relevant here because the installer alone is half a gigabyte and requires several steps of pre-installation taking several minutes before it even begins to try to install any of the component programs.

How I crave real Internets

October 26th, 2009 by Reinder

Reader Branko asks: "Reinder, how is life in the new fatherland? Have they internets there?"
Well I don't know about the rest of the USA (it'll be a while before it's really my new fatherland as I won't even be getting my fiancé visa until early next year) but here in rural middle Tennessee, life is pretty good except that the answer to the question about the internets is "yeah, kinda sorta". We're way out in the boonies and that means that what internets we get come with conditions that the civilised world has long since forgotten about*): data limits, bandwidth throttling, overage charges and dropped connections when the weather is bad or the moon is in the house of Jupiter. To be able to do my long-distance work at all, I need to switch between two internet connections, both of which we pay through the nose for. We've got Aggie's satellite dish connection that throttles you to slow modem speeds once you've reached the daily data limit of about 500 MB - an SDLX translation memory file for one of our larger clients will get you halfway there. The satellite service also limits the number of separate connections that can be made and if I'm sharing it with one of Aggie's sons playing World of Warcraft, it gets pretty slow.
The other connection is the cellular internet connection that I pay more for than I do for high-speed bandwidth and cable combined back in Groningen. It too has a data limit, which is even more draconian at 5 GB a month, but at least I have it all to myself and it never actually artificially lowers the speed. Out here, the reception is pretty poor though and rare is the day that it shows more than two bars out of four. The closest thing to a credible competitor that Verizon has here, AT&T, is not reachable at all and the only time I can read messages on my AT&T cell phone is when we drive out to Manchester or Tullahoma.

This is the biggest obstacle I face in working long-distance: the connectivity simply isn't good enough to push around the files I am working with. There is some prospect for improvement, as there are still a lot of houses being built in the neighborhood and the demand for broadband will eventually be there. Still, it's a pity that all the Federal stimulus money seems to have gone to repaving the roads that were already there instead of building new infrastructure such as broadband cables.

Anyway, I hope that this explains why posting here may be even slower than usual: on working days, I am being throttled, and on weekends and vacations, there are things to do in meatspace that, after a week of dealing with this sort of thing, I'd much rather be doing.

*) Belgians take note: you are not living in the civilised world, and unlike the people out here in the boonies who simply don't have the infrastructure, you have yourself to blame for tolerating the limitations your ISP's impose. A few well-aimed bricks through the right windows will help you shed your data shackles.

Just for once in my life, I’d like to not have to buy a new computer in a hurry

October 16th, 2009 by Reinder

With 36 hours to go before my next flight to Tennessee, the Macbook dies. That means that
a) I get to buy a new hard drive for my Macbook just to have access to my files (music including my vinyl album rips, and scans, the paper originals of many of which I have recently thrown out) minus the ones added since I last ran Time Machine;

b) I get to take all my installation materials to Amsterdam and install them at the address where I am sleeping over so I can catch my plane in the morning. If that doesn't work, I get to take a bricked laptop to Tennessee and try again while I'm there;

c) because Apple can't be relied upon to make hardware that survives even a short period of intensive use, instead of doing it all through Parallels on the Macbook, we get to buy a Dell box in a hurry for the long-distance work I will be doing. We do not get time to think about what precisely we want - we get to order quickly and hope it's up and running before my arrival. Just like with the current desktop at home in Groningen, and the studio machine before that, and the studio machine before that. Other people sometimes get to ponder their aging systems and say "Gee honey, maybe we should save up a bit of cash so we can replace this old box." I have not been in a position to do that for five or so years. I get to replace dead machines in a mad rush to meet the next deadline;

d) I get to stay up late to complete the preparations for my trip that I was working on at the time the laptop gave up. Obviously I don't get to do the ones that involve installing software on the laptop, but I did lose 90 minutes just trying to diagnose the problem (see: opaque operating systems and why they're a bad idea even if they're pretty);

e) I get to lose all the money I saved through 5 weeks of stepping up the frugality. Isn't it wonderful to be me?

Well at least I'll be seeing Aggie again in two days. So it's not all misery.

A standard procedure for digitizing LP’s.

October 4th, 2009 by Reinder

I'm really happy with my USB preamp for digitizing vinyl records and kinda wish I'd bought it years earlier. It'd have saved me a lot of money in CD remasters and other forms of buying-the-same-damned-record-again. What it allows me to do is remaster the records myself according to my own requirements. I find that I can get a very clean, dynamic sound out of most records I own, without having to worry about the digitized end product being clipped or overcompressed due to the Loudness war - even though I do normalize them to be pretty loud.

And I'm also finding that, because my records are for the most part very clean, it's dead easy. Here are the steps I take to digitize an LP side. These steps assume you have Audacity and a reasonably clean, modern record player:

1. Start Audacity with the USB preamp hooked up (doh) and the correct preferences set.
2. Put the record on the turntable and start it. If necessary, clean the record with an antistatic brush.
3. Hit the record button in Audacity. Drop the needle.
4. Wait 20 or so minutes, listening to the record to make mental notes of bad clicks or other rough spots.
5. When the needle lifts, stop recording.
6. Using the visual representation on screen, delete the bits before the needle drop and the bits after it is raised. Keep some of the "silent" parts before and after the end of the music.
7. Save the file with the naming/folder sorting scheme of your choice. I sort by artist and album but give the actual files basic names such as "Side A" or "Side B".
8. In Audacity's main menu, go to the Analyze menu and select Silence Finder. Accept the default settings and hit OK. A label track will appear marking silences of a second or longer with an "S". Usually, the mark appears right where the next track begins. Go through the markings, checking them as necessary and editing them so they show the names of the songs. While doing this, if you know any spots where there are major clicks, or you spot them as you go along, mark them on the label track as well, so you can cut them later.
9. In the main menu under File, go to Open Metadata Editor. Fill in the artist name, album name, year and genre, but nothing else, and hit OK.

These steps should give you a raw recording with everything properly labeled. The reason I do the labeling first is because I don't want to endlessly repeat listening to the album. I also don't like cutting the album into separate songs until the final stage. But if you want to do that, that's fine. The next few steps affect what is actually in the recording. Because Audacity does not support non-destructive edits, now would be a good time to save.
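For what it's worth, the silence-finding step can also be approximated outside Audacity. A rough sketch, assuming the numpy and soundfile packages and a WAV export of the raw side (the file name is an example):

    # Rough stand-in for Audacity's Silence Finder: print the end of every
    # stretch of near-silence lasting a second or more, which is usually
    # where the next track starts. Assumes numpy + soundfile.
    import numpy as np
    import soundfile as sf

    audio, rate = sf.read("side_a.wav")
    if audio.ndim > 1:
        audio = audio.mean(axis=1)        # mix down to mono for analysis

    threshold = 0.01                      # roughly -40 dBFS
    min_silence = rate                    # one second, in samples
    quiet = np.abs(audio) < threshold

    start = None
    for i, q in enumerate(quiet):
        if q and start is None:
            start = i
        elif not q and start is not None:
            if i - start >= min_silence:
                print(f"silence ends at {i / rate:.2f} s - likely the next track")
            start = None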

Next up, I usually normalize. You don't have to do that, but I want the tracks to hold their own against the tracks I already have in iTunes. "Loud" tracks get normalized all the way to 0 dB - after which I check for clipping. "Softer" tracks get normalized to -0.5 dB, the default value.
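The normalization itself is just peak scaling; here's the same operation as a sketch with numpy and soundfile, file names made up:

    # Peak normalization: scale the recording so its loudest sample sits at the
    # target level. Use 0 dB for "loud" records (then check for clipping),
    # -0.5 dB otherwise. Assumes numpy + soundfile.
    import numpy as np
    import soundfile as sf

    audio, rate = sf.read("side_a.wav")
    target_db = -0.5
    peak = np.max(np.abs(audio))
    if peak > 0:
        audio = audio * (10 ** (target_db / 20) / peak)
    sf.write("side_a_normalized.wav", audio, rate)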

Then, go back to the clicks you've marked and zoom in on each one until you can see its waveform. Select it and delete it - it is usually just a few thousandths of a second, and no musical information is preserved in a major click, so no one will notice it. It pays to use your eyes - select so that the waveform you get after deletion looks uninterrupted.
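In script form the same trick is just deleting a few milliseconds around the marked spot; a sketch, again assuming numpy and soundfile, with the click position made up:

    # Cut a click by deleting ~6 ms around it; at that length the edit is
    # inaudible and the waveform closes up seamlessly. Assumes numpy + soundfile.
    import numpy as np
    import soundfile as sf

    audio, rate = sf.read("side_a.wav")
    click_at = 83.217                     # seconds, read off the label track
    half_window = int(0.003 * rate)
    i = int(click_at * rate)
    audio = np.delete(audio, slice(i - half_window, i + half_window), axis=0)
    sf.write("side_a_declicked.wav", audio, rate)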

Next, select some of the silent bits from the beginning or end of the record and go to the Effect menu -> Noise Removal. In the window that opens, hit Get Noise Profile. Then select the entire recording, go to Effect menu -> Noise Removal again and start experimenting. You may want to try the default values first, but I think they remove too much noise at the expense of the overall dynamics of the recording - after all, it uses an algorithm to guess at what is and what isn't noise, and sometimes gets it wrong. I usually end up taking out 10 dB or less - especially if for some reason I haven't been able to normalize all the way up to 0 dB. You're going to have to use a lot of trial and error here, and this is where you're most likely to get it wrong and have to redo the work. When in doubt, skip the step entirely and live with the noise floor.
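If you ever want to do this step outside Audacity, the noisereduce Python package does something comparable: you hand it a noise sample and a reduction amount and it gates the rest of the recording against that profile. This is a stand-in, not what Audacity does internally; a sketch assuming a mono recording and made-up file names:

    # Spectral noise reduction from a "noise profile": use the first second of
    # run-in groove as the profile, then reduce it across the whole side.
    # prop_decrease plays the role of the dB slider; keep it conservative.
    import soundfile as sf
    import noisereduce as nr

    audio, rate = sf.read("side_a.wav")
    noise_clip = audio[:rate]             # first second: needle in the lead-in groove
    cleaned = nr.reduce_noise(y=audio, sr=rate, y_noise=noise_clip,
                              prop_decrease=0.5)
    sf.write("side_a_denoised.wav", cleaned, rate)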

Once you're satisfied, select individual songs using the markings you made earlier, and export to the format of your choice using File -> Export Selection. You'll be prompted again with the metadata editor window, and this is where you enter the track name, number and the year if it's not the same year as the other recordings on the album. Don't use Export! It will export the entire LP as one track.
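Scripted, the export step is just slicing at the boundaries you labeled and writing one file per track; a sketch with soundfile, where the boundaries and titles are placeholders (tagging would still need a separate tool):

    # Slice the side at the labeled boundaries and write one file per track.
    # soundfile picks the FLAC format from the extension. Assumes soundfile.
    import soundfile as sf

    audio, rate = sf.read("side_a.wav")
    tracks = [("01 First Song", 0.0, 251.4),    # (title, start in s, end in s)
              ("02 Second Song", 251.4, 498.0)]
    for title, start, end in tracks:
        sf.write(f"{title}.flac", audio[int(start * rate):int(end * rate)], rate)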

After that is done, I usually close without saving so I keep the unedited music in case I'm unhappy with the results later. This sometimes happens, but by this time I usually have a recording that is clean and loud without being clipped or smooshed. And because the normalization and noise reducing steps are really macro steps that don't require close interaction with the recording, I can reproduce them easily.

This is my approach; there are others and I may change mine as I learn more. If you like separating the songs out early, you can do so. Noise removal is more important if you use headphones a lot; I find the existing noise levels to be less of a problem on speakers. Because I expect to use headphones more in the future, I am hedging my bets here.

On A/D conversion, advice and the weakest link in the chain

September 29th, 2009 by Reinder

Reader Branko asked me privately what brand/type of USB preamp I'd bought, because he too wants to digitise his vinyl record collection. I'm really the wrong person to ask, because the entire comparison shopping process for me was to go to Okaphone, a local electronics store catering to DJ's, and ask for the recommendation of the guy behind the counter. He told me they had two models in store, they were both the same price, and people had the best experiences with the one from JB Systems, so I bought that and got to work.

It didn't make sense for me to put in more effort, because
a) I'll be leaving the country again in three weeks and don't have time for agonising over specifications and price/quality ratios (beyond what's obviously sensible) if I'm to get any digitizing done; and
b) The other components aren't exactly of audiophile quality. The weakest link in my audio chain is the turntable, followed by the speakers, amplifier, room acoustics and my damaged ears.
The turntable is over 30 years old, the automatic start/stop no longer works well, the platter scrapes the console deck when I place a 180-gram vinyl record on it before I let the needle drop, and it's only a matter of time before the next issue rears its ugly head. In fact, I had endless fun, for a given value of 'fun', over the weekend trying to figure out what was making it sound so fluttery all of a sudden. I bought a new, shorter belt for it, which worked well with my LPs but caused horrible machine rumble with 45s, so now I switch between the old belt and the new belt whenever I need to change the rotation speed. I get good sound out of it all, but it takes a lot of work, and if I weren't going to emigrate within a year, I'd definitely replace the turntable. Luckily, the cartridge is new so I won't have to worry about that.

The sound card on my MacBook is good, and the USB device seems to be doing a good enough job. I'm satisfied with it under the actual operating conditions I'm working in.

So my advice to Branko is not to sweat the choice of A/D converters too much but make sure your turntable is OK—is the cartridge new and of high quality? Is your drive system (belt or direct) dependable? Also, are your records clean? Mine are, but my mother's still have dust on them after repeated cleanings with different methods, and it is affecting their sound. I am pretty sure that all things being equal, those will affect your experience a lot more than doubling your expense on the A/D converter.

Adam’s basic sound restoration tutorial

September 20th, 2009 by Reinder

I asked Adam for some pointers on restoring old records, and he spent a whole night creating a tutorial podcast on basic sound restoration, which tells me everything I need to restore my mother's old Vienna Boys Choir recordings*). "Basic" here includes a lot of things I had already figured out but it also includes the next few steps to create the right balance between noise removal and preserving the freshness of the original sound.

Adam takes a leisurely approach, playing back the unedited recording in full at the start of his podcast, which makes this good for casual listening over breakfast and coffee. We discussed his click removal approach a little in private while I was listening to the podcast. His argument is that people won't hear the disappearance of a few thousandths of a second, but if you're really finicky about preserving the tempo, you can do what I've been doing, which is to take a piece of music from just before the click that is the same length as the click, and paste it over the click. If the recording is repetitive enough, you can even drop in a bit from another repeating part. My other comment is on his remark that Audacity's built-in click removal effect is good enough for modern recordings. I told him as much the day before the podcast, but actually, it's really only 75% good enough; I've had to do manual removal a few times even on my own 45s from the 1980s.
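For the finicky, that patch-over trick is easy to express in code: copy an equal-length slice from just before the click and write it over the damaged samples, so the tempo is untouched. A sketch with soundfile; the positions are hypothetical:

    # Patch a click with an equal-length slice taken from just before it,
    # preserving the tempo. Assumes soundfile; times are made up.
    import soundfile as sf

    audio, rate = sf.read("choir_side_a.wav")
    click_start = int(12.480 * rate)
    click_len = int(0.004 * rate)         # a roughly 4 ms click
    audio[click_start:click_start + click_len] = audio[click_start - click_len:click_start]
    sf.write("choir_side_a_patched.wav", audio, rate)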

*) It was my mother's request for help that got me interested in transferring my own records and wanting to learn more about sound restoration. It's been a lot of fun and got me reconnected with records I haven't listened to in almost 20 years.

Return of the Son of the End of Free, Part II

May 17th, 2009 by Reinder

Over the years, I've become skeptical of paid content as a viable model for most of the content being published online, particularly for webcomics. In the previous part, I discussed what I believe are the reasons micropayments have historically failed and free content resurged in the mid-2000s.

I believe Rupert Murdoch's plan to start charging for The Wall Street Journal online will also fail, but for a different, much simpler reason: this recession is much worse than the last one, and end users are keeping their wallets shut much more. Even if the problems with the infrastructure and the immediacy of micropayments are resolved and users finally start understanding the concept, they are going to pinch their pennies, hard, and refuse to pay for anything they can get for free elsewhere. Entertainment, which is much less fungible than news, will not be safe from this: if the money simply isn't there, people won't buy it and will instead go with the inferior good that they can afford. Or they will simply entertain themselves: the choice won't be between a paid Radiohead album and a free Hootie and the Blowfish album as Scott McCloud argued in the essay I quoted in Part I, but between a paid Radiohead album and a game of Monopoly with the family, a free knees-up at the Irish pub or an hour practicing Radiohead songs on the guitar.

However, there are two long-run scenarios in which micropayments and subscriptions may win out. I hope these won't come to pass, as neither of them would be pretty. They are the Big Content Squeeze of 2010 and the Google Power Grab scenario. In both, Rupert Murdoch's assertion that the Internet will never be the same again will be correct.

The Big Content Squeeze of 2010
The recession continues through 2009 and into 2010 and it hits hard. Initially, this means more free content in the form of blogs as newly unemployed people turn to writing. However, it becomes harder to finance the content. Small-time bloggers move from their own hosted space to free bloghosts to save money. Then the free bloghosts stop being free and the blogs vanish.

Meanwhile, newspapers stop treating their free content as loss leaders, and start seeing it as the profit-eater it really is. Some switch to micropayment solutions, which fail, before shutting down their sites. Others shut down at once. This robs the remaining bloggers of much of their material, because most news/politics/gossip/satire bloggers do not do original news gathering and are entirely parasitic on the so-called Mainstream Media (the idea that bloggers are "citizen journalists" is pure, unadulterated Bloggocks). The quality and interest level of those blogs drops and so do their revenues. Bit by bit, the entire Long Tail of all websites disappears. Comicspace loses its advertising revenues and its venture capital funding at the same time. Keenspot loses its advertising revenues. Both firms close their doors and only the most successful comics hang on to their existence, on independent hosts and subsidized directly by their users.

Eventually, the Short Head, the highest-quality, most popular websites, starts getting eaten as well. By that time, though, content is no longer abundantly available and is indeed getting quite scarce. People who want to read news or blogs or webcomics online have the choice between paying for them or not getting any at all. In this new landscape, micropayments are a viable model once the recession starts bottoming out. By the time the recovery is finally under way, micropayments and the sites financed by them are entrenched, the infrastructure for content paid for by advertising is dead and gone and new, free content sites will not be immediately competitive because users will be loyal to the content they have already paid for.

This end result, of course, isn't all bad. The result of this Darwinian process will be a smaller number of sites that have high quality by a number of metrics. They won't waste the users' time, they will be well-made and worth paying for - for a time, at least. They will also have to stay strictly within the mainstream and within the boundaries of acceptable opinion and taste. There will not be a significant Long Tail of niche sites. As the successful media get entrenched, the lack of competition and the need to avoid giving offense may lead to blander, less interesting content - it will continue to be very good at a technical level, but will it challenge the reader? And if it doesn't, where else will you go if you do want to be challenged?

The Google Power Grab
This scenario, on the other hand, is one whose outcome won't be good at all. In this one, Google develops a working micropayment system (currently, Google Checkout does not support true micropayments as defined back in 2000, but is suitable for larger payments. I don't know anyone who uses it, though), and sits down with News Corp and all the other big media outfits until they all sign up to use that system exclusively. Because Google already has your data, you probably already have an account with it and most people trust it far more than they should, it is in a position to make its system ubiquitous and immediate in one fell swoop, and it has the funding to ride out the rest of the recession. It can also give preferential treatment to sites covered under its micropayment system, making them show up first in searches and embedding micropayments code into its search links so these sites perform better than non-micropayment sites. People will still be reluctant to use them for as long as the recession lasts, but they will be pressured into accepting them earlier than if any other party supplies the micropayment service (because they will be shut out of the best search results if they don't) and once they get more money into their pockets again, they will start embracing them.

In this scenario, Google leverages its power to gain even more power, and unlike in the Big Squeeze scenario, the big media win without having to raise their game for even a moment. The landscape changes irrevocably, to the advantage of parties that are already entrenched.

I'm not happy with either scenario. The first one seems more likely right now than the second, as I recently read an essay (on a Dutch newspaper's blog, no less - but I unfortunately didn't take note of where it was and can't find it anymore) in which the writer recommended that newspapers shut down their websites entirely so they'd stop competing with their paper editions. But I'll be glad if neither comes to pass, and I consider not having a good, viable micropayments system on the web to be a small price to pay for that.