Today, I'd like to discuss a more philosophical and controversial issue. Let me start with some background. At home I use a Sony digital camcorder, a Panasonic VCR, a Canon digital camera, a Philips TV, a Pioneer DVD system, and a Fujitsu-Siemens Activy Media Center. "So what?" you may think, "that is nothing special." And, of course, you are absolutely right. In each area I chose the device I had heard was great for a specific task. We might call this a best-of-breed approach. Most will agree that this is fine as long as the devices meet my requirements. Of course, it is not only the devices as standalone gadgets that matter but also their interaction. For example, some VCRs and TVs have a special link interface to synchronize their channel settings directly. Great, but I don't care. On the other hand, it would be unacceptable if I could not play videos in the VCR and display them on my TV screen. Audio/video interoperability is definitely a must.

Still don't see what I am talking about? Let me give you some additional details: I use Linux and Windows, and I work with both Java and .NET. I am editor-in-chief of a German Java magazine called "JavaSPEKTRUM", and at the same time I am an appointed Microsoft MVP (Most Valuable Professional). Can you see the problem now? Some people ask me how I could sell my soul to Microsoft, while others wonder why anyone would ever use Java instead of Microsoft technologies. And that is the lesson learned: in computer and software technology, things are debated in almost religious dimensions. There may be similar discussions in other areas too (iPod versus other MP3 players, BMW versus Mercedes-Benz, Stallone versus Schwarzenegger, ...), but IT people seem to be much more involved in these stupid religious battles. I don't like these useless discussions, as they are only a waste of time. But often it is hard to remain uninvolved.
To be honest, I work for Siemens, one of the biggest companies in the world (440,000 employees). In our products you'll probably find all kinds of operating systems, programming languages, and platforms, and these products are developed using various engineering processes, different configuration management systems, and so on. Do you see the point? For the consumer it does not matter what IT technologies we use. Everything is fine as long as we can satisfy all requirements. Technology does matter, but architecture matters even more.

I take a rather pragmatic approach: use whatever technologies help you meet your requirements. First collect your requirements and use cases. Then build a baseline architecture. Only in the third step decide which technologies and tools to use for the implementation. I have been involved in projects where the technology decisions came first. Interestingly enough, most of these projects failed either partially or completely. Sometimes it is helpful to compare this technology-first approach with other areas. Want to cut a piece of wood into two halves? Of course, you should use a screwdriver! Want to open your PC? Why not use a hammer? Nobody who is not completely insane would ever consider solving these problems with the stupid solutions I just proposed. It is obvious that we should use tools and technologies only for what they were built for. Why should software development be any different? I love Java and .NET, and I still love C++ and CORBA, because each of these technologies is extremely cool for a specific range of problem contexts. And what if there is a problem where we could use two or even all of them? That is even cooler, because then we have a choice.