There’s no question that Oracle’s lawsuit against Google will have a chilling effect on Java development. But will it have a similar effect on Oracle development? If it does, even to a slight extent, the suit could be a turning point for Oracle’s fortunes.
The Google Java lawsuit is a tricky business to begin with, since on the surface Google appears to have licenses for everything it is doing with Java. If there is anything improper in what Google has done, Oracle will have to establish it through some tricky line-drawing as it prosecutes its case. The significant thing is that the case turns on legal distinctions the average corporate information technology manager won’t understand.
One of the ironies of this case, from Google’s point of view, is that it will surely spend more money defending itself in court than it would cost to redo its entire project without any Java components. Indeed, it may end up having to pay for both: a waste, from a management bean-counting point of view. It’s the kind of waste that makes managers call other managers into conference rooms to ask, “Why were we using Java in the first place?” And of course, lots of software engineers are asking the same thing about Google at this point: “No, seriously. A major corporation that knows a few things about software was actually trying to develop a mission-critical project in Java?”
I’ve written elsewhere about the perils of using legacy software as a foundation for creating something new, so I don’t need to belabor that point here. The more interesting point is the comparison between Java and Oracle’s big-money software, the Oracle database. Like Java, the Oracle database is basically useless by itself. The reason large corporations fork over billions of dollars for it is to run their custom applications in the Oracle environment. In other words, their relationship with Oracle is much like Google’s relationship with Java.
And now Oracle is taking Google to court. Is this the beginning of a new business model at Oracle, in which it selectively files suits against its software users to try to collect more money? No one knows, of course, but if you are a manager making decisions about what software to spend tens of millions of dollars on, do you want to take that chance?
The automatic reaction is to say that none of this matters, that the Google situation isn’t going to affect any Oracle projects. And when projects that might have used Oracle go ahead with a different approach, the managers involved will say it wouldn’t have made any sense to use Oracle anyway. But the fact is, Oracle is used for lots of projects where it doesn’t make sense, just as Java was used at Google for a project that leaves engineers scratching their heads. If Oracle loses even a fraction of that segment of corporate IT, it will be a big monetary loss for the company.
And I don’t see any way to avoid that result. Some people have said there must have been ongoing negotiations between Oracle and Google, and that Oracle filed suit to add some intimidation to its negotiating position. The problem with intimidation, though, as Robert J. Ringer told us years ago, is that people simply go away: if you’re an intimidator, people start avoiding you. For those who think this rule doesn’t apply in the corporate IT world, I can only point to the example of SCO, and how quickly it killed off Unix. In 2001, Unix was the standard by which operating systems were measured. It took only a few SCO lawsuits (along with thousands of cease-and-desist letters) and a few years for Unix to end up not only dead but discredited in court, shown to be mostly copied from its competitors.

Let’s not forget that this episode is a big part of the reason Oracle finds itself owning Java. Java was invented by Sun, which was essentially a Unix company. When Unix fell out of favor, Sun couldn’t find a way to make money, and in the end it was bought by Oracle for less than the value of Java. Now this pattern may be repeating itself.