
On Microsoft and Backwards Compatibility: Windows 8 (and How to Make it More Usable)

Note: If you don't feel like reading this entire post, at least read this: Start8 from Stardock Corp (30 day free trial, 5 bucks after that, and well worth it IMO): Get your start button back, disable hotspots, and boot directly to the desktop with that utility. Otherwise, read on.

This post is not going to be a diatribe either for or against Windows 8 (though I will give a brief opinion towards the end of this post), but rather on Microsoft's stance on backwards compatibility vs. other operating system vendors.

Love or hate Microsoft, they bend over backwards to preserve backwards compatibility between OS releases. If you want to, you can still run DOS games on Windows 7. Code written for their old 16bit systems will happily run on newer 32bit rigs (64bit is another story, more on that below). When they do a major OS release, they reach out to every vendor they can, and in many cases will help them patch their software packages to run on the new OS (why, you might ask? Microsoft is nothing without 3rd party software vendors…people and corporations buy Windows so they can run the apps they need to be productive, and since Windows is ubiquitous on the desktop, everyone wins in the end). I can only think of a couple cases where Microsoft broke backwards compatibility in a major way (there are probably more, but these are the ones I know of, and they are mainly developer-centric as that's what I do for a living):

  • The shift from 16bit to 32bit (they did NOT repeat this mistake during the move from 32bit to 64bit). The shift from Windows 3.x to Windows 95 and up was drastic to say the least, but they had no choice: the 16bit address space was simply too small. If this doesn't make any sense, I'll put it like this: a 16bit pointer can only address 64KB at a time, and even with segmentation tricks 16bit Windows was practically limited to 16MB of memory. Granted, back in the early 90's that was adequate, but not nearly enough towards the mid 90's. On a 32bit system, 4 gigs can be addressed (though due to limitations in XP/Vista/Win7, only about 3.5 gigs is usable without some hacks), which these days still isn't enough for many business uses (I saturate 4 gigs easily on my rigs, and the 16 gigs on my main server isn't enough either). In the 64bit address space, you can theoretically address 16 exabytes (2^64 bytes), but because current chips only implement 48 address bits, ~256 terabytes is the practical max. I don't see the move to 128bit coming any time in our lifetimes. (There's a short sketch after this list that walks through the arithmetic.)
    • How did Microsoft not break backwards compatibility during the 32bit > 64bit shift? In techno jargon, it's called thunking. You can indeed run 16bit code on a 32bit machine via virtualization, but you cannot run 16bit code on a 64bit machine; the processor's 64bit mode doesn't support the virtual-8086 trick that makes it possible. 64bit versions of Windows use a thunking layer called WOW64 (Windows on Windows 64). This guarantees that virtually all 32bit apps will run on a 64bit machine. For consumers, the ONLY reason to run a 64bit rig is the extra memory support; it is not recommended to install native 64bit builds of apps unless you have a very compelling reason to do so. Not sure if you are running 64bit? Fire up Task Manager and go to the Processes tab: processes marked *32 (or "(32 bit)" in the new Windows 8 Task Manager) are running in the 32bit (WOW64) compatibility layer. Or, go to your system drive, and if you see a folder called Program Files (x86), you are running 64bit.
  • Microsoft caused a huge furor amongst developers when they deprecated Visual Basic 6 in favor of Visual Basic .Net. This is probably the biggest break they've ever made, as they provided no tools to port existing code, and VB6 developers basically had to completely retool themselves for VB.Net; the two share little beyond a common syntax. Microsoft rolled the dice, and won. The dust settled, the VB6 folks retooled, begrudgingly at times, and the programming space is MUCH better because of it. My friend Joel Spolsky disagrees though in his excellent article about how Microsoft lost the API war. The shift for me couldn't have come soon enough as I loathed VB6 and immediately jumped on C#.
  • Related to the above, Microsoft dropped ASP as their official framework for dynamic websites in favor of ASP.Net, though technically classic ASP is still supported by their web server, IIS. Why anyone would still want to write classic ASP is beyond me, but there is probably some cranky programmer somewhere still clinging to their copy of Visual InterDev hammering out VBScript.
  • The move from the Windows 9x kernel to the NT kernel. This is by far the best thing Microsoft could have done for consumers (business users had the NT kernel from Windows NT 3.1 onwards). If you don't know what a kernel is, don't worry about it; just know that the NT kernel was the first fully 32bit Windows kernel, while the 9x kernels were 16bit/32bit hybrids and a total mess…remember all the BSOD's we used to get on 9x OS's? NT is much more stable. At one point, MSFT was maintaining a slew of different kernels, which meant they had to target each kernel, each version of each kernel, and ship a version of apps like Office for each one: a maintenance nightmare to say the least. Windows ME (arguably MSFT's worst product ever) was the last to use the 9x kernel; Windows XP (arguably one of their best consumer products ever) ditched the 9x kernel for NT. All OS's since are NT based, one kernel to rule them all (though the new ARM-based RT tablets complicate that picture a bit). Regardless, the jump to NT created a slew of problems for legacy application developers, but again in the end MSFT made the right move, and the consumer space is better for it.
  • The introduction of the ribbon in Office. I wouldn't necessarily call this a break in backwards compatibility as it was mainly a UX (User Experience) change, but it was met with heated resistance. I hate to say it, but menus are so 20th century at this point…cumbersome, ugly, and not intuitive. The biggest change for me was that they changed some common keyboard shortcuts in Outlook, but other than that, it only took me a few weeks to get used to the ribbon. Users better get used to it as this is now the standard UX on Windows 8.
  • There are other backwards compatibility breaks in the developer/business space, but those are beyond the scope and audience of this post.
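To put some numbers behind the address space talk in the first bullet, here is a minimal Python sketch (my own illustration, nothing official from Microsoft) that computes the sizes as straight powers of two, plus a crude scripted version of the "Program Files (x86)" check from the thunking bullet. The 48-bit entry reflects the virtual address width current x86-64 chips implement, and the folder path assumes a stock Windows install on the C: drive.

    import os
    import platform

    def human(n_bytes):
        """Render a byte count in binary units (KB, MB, GB, ...)."""
        units = ["bytes", "KB", "MB", "GB", "TB", "PB", "EB"]
        for unit in units:
            if n_bytes < 1024 or unit == units[-1]:
                return f"{n_bytes:g} {unit}"
            n_bytes /= 1024

    # Address space is just 2^(pointer width) bytes.
    for bits in (16, 32, 48, 64):
        print(f"{bits}-bit address space: {human(2 ** bits)}")
    # 16-bit: 64 KB, 32-bit: 4 GB, 48-bit: 256 TB, 64-bit: 16 EB

    # Crude bitness check: 64-bit Windows keeps 32-bit (WOW64) apps in a
    # separate "Program Files (x86)" folder. Assumes a stock install on C:.
    print("Machine:", platform.machine())
    print("Looks like 64-bit Windows?", os.path.isdir(r"C:\Program Files (x86)"))

Nothing fancy: the point is simply that doubling the pointer width squares the addressable space, which is why the 16bit-to-32bit and 32bit-to-64bit transitions were worth the pain.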

Which brings me to Windows 8 (which IMO is just Windows 7.1). Again this is not meant to be a review or even that much of an opinion on Windows 8 as there are many things I love about the new OS, and a handful of things I abhor. Plus, there are plenty of reviews from professionals on the web already. I'll start with the good:

  • Resource usage is incredible compared to Windows 7. This is no doubt due to Microsoft ditching the Aero interface (which I thought I would miss initially, but I actually like the new flatter interface). By resource usage, I mean memory consumption, processor usage, hard drive saturation, etc. I haven't done any serious benchmarking, but memory usage is roughly half that of my Windows 7 machines, and processor time seems much better utilized (it's a dual core overclocked to 3.8 GHz, so that could be part of it). My Windows machine just feels faster, which means it is. Startup time is phenomenal, about 60 seconds to desktop…err, Metro (and this was an upgrade, not a clean install; I bet a clean install would be even faster). It should be noted that my main workstation is 64bit, so that could be part of it as well, but it just feels crazy fast.
  • The new task manager is fantastic. Seems a little silly, but I almost always have an instance of task manager open. You have to see it to believe it; task manager has remained largely unchanged since the mid 90's. It expresses information in a much more logical intuitive manner with a refreshed GUI.
  • You can run XBox games on Windows 8 machines (I don't know if this is fully implemented yet, but this is pretty cool).

And now the bad; these are huge for me and were almost deal breakers initially:

  • Windows Media Player no longer supports playing DVD's. Read that sentence again. I don't watch DVD's on my machine at home, but on long trips (planes, trains, etc.) I do. Granted, 3rd party DVD players are of course still supported, but MSFT stated that the licensing costs for WMP's DVD codecs weren't worth it. Whatever.
  • Windows Media Center is now a paid add-on. My entire home theater is based around a WMC Virtual Machine which houses hundreds of digitized DVD's that I run through WMC.

And finally, the biggie:

METRO :-(

First off, I do applaud what Microsoft is attempting to do with Metro: Present a unified UX across all of their platforms, in this case Windows, WinPhone, and XBox, and as a byproduct of this, a unified framework for developers such as myself, the intention being that we can write one codebase in the Metro languages, then deploy it to any device that supports the Metro framework. In theory at least; I haven't tried it yet.

I do realize that Metro is optimized for touch devices, and I have it installed on an old tablet…for touch, it's quite usable. But for the 95% of the population who don't have touch devices, the UX is miserable. One of the #1 rules of web design is no horizontal scrolling, ever. Guess what the Metro desktop, and almost all Metro style apps, make you do? Scroll horizontally. For touch this is actually fine, but with keyboard/mouse, it's an awful experience. And, the Metro interface just looks silly on my 26" LCD (looks great on a phone and XBox though). The Windows store apps that I've played around with are dumbed down to the level of a kindergartener, and I haven't found a way to tile them so I can use more than one at a time (full screen only in Metro, really???). And, finding apps is a complete pain: you have to search for everything, which means you have to remember the name of the app you want. Bringing up the search "charm" (as it's called) does present a screen listing all of your apps…but it lists ALL of them, with no way to collapse sections that I've found. It is nothing like the start menu that I've been using for almost 20 years, so it's a huge break in the UX department. Finding apps is an abysmal task at best, and fruitless at worst.

What really sucks about the new Metro "start/search menu", if you can even call it that, is that you must use it. The start button is gone with no native way to get it back (more on this at the end of the post), which is a major UX no-no. Microsoft is shoving the new Metro interface down our throats. There is also no way to boot directly to the desktop; you have to go through Metro first (hint: WinKey-D will get you to the desktop from Metro). I can see MSFT helpdesk costs going through the roof with calls like "where is the start button" and "how do I find my dancing bunnies application"…it is beyond me why they didn't provide these two legacy options. Just give me my start button/desktop back.

Once you are in desktop mode, all is still not so good. Each corner of the desktop brings up what Microsoft calls "Hot Spots"…when you drag your mouse to a corner, a little popup shows. Lower left is the hotspot for the Metro start menu (or just hit the Windows key), top left is a mini application switcher, and the top and bottom right corners bring up the "charm bar" with the following options: Search, Share, Start, Devices and Settings. The hotspot locations for the charm bar are in the worst locations imaginable though. I work 95% of my time with a window maximized. Every time I want to exit out of a window (when I'm too lazy to alt-f4) via the X button…well, think about it: Where is the X button for a maximized window? The top right corner, so the charm bar pops up and, even worse, it steals focus. It usually takes me a couple of attempts just to close a window. The bottom right corner is no better: When I don't feel like WinKey-D to minimize all open windows, I hit the show-desktop button, which is located where? The bottom right corner. So again, attempting to do that brings up the charm bar, which steals focus. I have not found a way to natively disable hotspots, or at least move them to different corners.

From the desktop, hitting the Windows key, as expected, brings up the new start menu. But the transition from desktop to Metro is visually jarring. I live and die via the search box in the Windows 7 start menu: I can bring up apps, type in simple commands…I probably hit the start button (or WinKey) a hundred times a day. Hitting Ctrl-Esc or WinKey a hundred times a day on Windows 8 would probably give me a seizure, or at least a horrible headache by the end of the day. Yes, it's that jarring.

Fortunately there are some workarounds. The first thing any Windows 8 power-user needs to do is read this article by Scott Hanselman on how to make Windows 8 much more usable via keyboard shortcuts: Windows 8 productivity. Again, I cannot stress this enough: Read it, memorize it, print it out and tack it to your cube wall, whatever it takes.

And finally, the tool that has cemented my move to Windows 8, and solves all of the UX problems I mentioned above: Start8 from Stardock Corp (30 day free trial, 5 bucks after that, and well worth it IMO). I never install 3rd party GUI enhancements/add-ons/themes because they inevitably screw something up (plus I do a lot of design work and need the GUI in its default state). Stardock is a very reputable company that has been making GUI enhancements for Windows since…well, since Windows had a GUI. At the risk of this sounding like an advertisement, this little tool is incredible. You can:

  • Get your start button back (it is 100% identical to the Windows 7 start button)
  • Boot directly to the desktop and bypass Metro completely
  • Disable all the hotspots

This tiny little tool makes Windows 8 completely usable. Is it silly to harp on about not having a start button? It might sound that way, but the start button is synonymous with Windows; it's the very definition of the Windows desktop. For people who make their living using Windows as their main tool, removing it is like the gov't making us switch to driving on the left hand side of the street, or keyboard makers switching to DVORAK instead of QWERTY, or mouse makers getting rid of right clicking. For me, yes, it's on that scale, and for many other colleagues I know as well.

For me as a software developer, the transition to Windows 8 isn't optional, it's a requirement for my job. Another break MSFT made is that you cannot develop Metro apps on any OS except Windows 8/Windows Server 2012. There is no Metro runtime/emulator for Windows 7 (which I find odd), no doubt so they can push Windows 8 license sales; makes fiscal sense, but at the expense of pissing off developers, and whoever makes the budget for IT departments. Before I found that little utility, I dreaded hopping on my Windows 8 machine; now I actually look forward to it. The point is that one tiny little change like removing the start button, and thus undoing over 20 years of UX look and feel, can have a huge impact. Always give your users an option to revert during a transitional phase as big as this, like they did with the Ribbon (in the first version of Office that had the ribbon, you could still hit the alt button to get the legacy menus back; this was removed in Office 2010…at least they provided a transitional option), then remove it in Windows 9 if need be.

Overall, I like Windows 8, and I have high hopes for what MSFT is attempting to do with Metro. V1 (like most things Microsoft) will be slow to penetrate the market, but V2 should be better. I can see businesses that have employees out in the field adopting some tablet hardware and doing some Metro development in house for mobile-type apps, but cube dwellers will probably never see Windows 8 unless MSFT gives us some Group Policy options to boot to the desktop and a native start button option. It would just be too costly to deploy and support, for absolutely zero gain. Even then, Windows 7 is good enough for businesses (and many are still on Vista, and even XP).

The rest of this post will be largely academic in nature, covering backwards compatibility and how it affects the three main software market segments. If that doesn't interest you, you can safely stop reading here; otherwise, read on if you're curious, brave, bored, or need something to help you sleep.

Still here?

A little more about backwards compatibility and what it means in the IT realm, the outward facing website realm (Twitter, Facebook, Google), and the ISV (Independent Software Vendor) realm. First, a differentiation among the three (and there are huge differences). When I tell people I am a software developer, or a web developer, they immediately think I write games, or applications like Word or Photoshop, or operating systems like Windows, or websites like those I listed above. A high level definition of each of the above:

  • ISV's are companies like Adobe, or Symantec, or AutoDesk…there are literally thousands of ISV's (other examples are business service companies like SAP and PeopleSoft…companies that large organizations hire to implement custom CRM/ERP/HR solutions; this end of the ISV business is crazy lucrative, but extremely difficult to run, and I know nothing about them). Basically, they are companies that write and sell software via licenses for different OS's. Adobe is a classic example of an ISV: They write creativity software for Windows and OS X that you install on your machine, and they charge a fortune for it. They make their money via software licenses (Microsoft is both an OS vendor and an ISV: They license Office and numerous other desktop apps, but they are also a services company). ISV's usually target a very narrow vertical market: They do one thing, and they do it exceptionally well by cornering (or creating) a market.
  • Outward facing websites like Twitter, Google, Facebook, and MySpace are a relatively new addition to the software space. Most people don't understand how these companies work, or how they are profitable. Hell, I barely understand their business model. It should be noted that these companies are NOT software companies; they are advertising based companies that just happen to use software/websites to generate the ad revenue that pays the bills. That does not mean these firms don't have some of the most brilliant minds on the planet working for them. These companies have legions of data scientists, software engineers, statisticians, mathematicians, physicists, chemists…you name it, they hire the best minds, for a couple of reasons. I'll use Google as an example:
    • Google uses its search service to mine user data and patterns, plain and simple. It's not as creepy as it sounds; all ad based sites do this. But Google is an ad company first and foremost; they just happen to use the internet as a platform to dish those ads out to consumers. They make all of their money from firms buying ads from them, which they then serve up around the web. The reason they turn a profit in such a thin-margin market segment is because they have the best scientists working for them, building the most comprehensive data mining algorithms on the planet.
    • Google uses off the shelf components in their data centers. Stuff you or I can pick up from any web retailer. This saves them a ton of money. Plus, they are one of the greenest companies on the planet (Google "Google Datacenter Green" and read some of the articles), and they are transparent about their green processes; they give it all back. Datacenters consume enormous amounts of energy, so if they can eke out a couple pennies per kilowatt, they save millions of dollars per year. Hence why they have physicists, chemists, environmental engineers, etc. They don't make any money from being green, but they save a ton of money, and they give all the information they've learned back to the community.
    • I'll touch on Facebook briefly as they do make money from ads, but FB cornered a relatively new space on the internet: Selling the FB platform as a service itself. I don't know the exact stats, and yes they do make the bulk of their money from ad sales, but FB is as much of a platform as it is a social site. Think about all the crap-ville games and other apps that we constantly get nagged about. FB makes a fortune from developers who want to peddle their wares via the FB platform (don't know what percentage they take in revenues). But they are still ad based.
  • And finally, what I do for a living: Information Technology (the software side, not infrastructure). This is the hardest of the 3 to explain. First off, IT departments do not make profits for a company, ever. If anything, they are probably the most expensive cost center of any company, hence why when layoffs roll around, we are the first to go. Information technology is not a new field (it's been around since business itself), it's just one that has seen enormous growth due to computing advances and cost decreases. All companies have an IT department, or at least a 3rd party vendor they use; it's a necessary evil, as data and information storage/flow/dissemination/analysis is the core lifeblood of any business. Easier/smarter/faster is the mantra of IT. The "core" core of a business is its data. This is similar to the web companies listed above: How can we analyze our data to make better business decisions and become more profitable? At its heart, this is what IT is: Gathering data from customers via employees, sending it down the pipeline, scrubbing it, manipulating it, and storing it for later analysis. We just happen to use software as a tool to input and analyze this data. Yes, I write software and websites, but what I really do all day is streamline data flow and processes, gather business requirements and translate them into software, build new systems to gather and analyze information, build automated systems to communicate with other business systems…it's all about information and data. We help the company either make more money or lose less money. Sometimes, that means automating a knowledge worker's job, but that saves the company money. How do IT departments make money then (IT workers are paid quite well)? Of course we're given a budget from the company "kitty" as I like to call it, but we also bill other departments that we provide services for. They end up saving money in the end, but IT managers (good ones at least) are constantly drumming up business for their departments. This may sound odd to folks not familiar with the "business" end of IT, but in medium to large companies, most departments A) don't know what IT is, B) don't know how/what/why they need processes streamlined (IT managers need to sell this), and C) if they do know about IT, have no idea how to get in touch with them. So we're almost like a mini consulting firm within a company, and we get paid off of selling our services to other departments. It is also (at times) extremely layered and bureaucratic. An IT dept. (at its most heavily layered) consists of the following roles:
    • The CIO: The big boss, shot caller, decision maker.
    • Dept. Director: Drives sales and drums up business for the cube dwellers.
    • Systems architect: In charge of overall systems design and application architecture (usually not a supervisory role, but typically the most senior person on the software team).
    • Database administrators: They hold the keys to the castle, the company's databases. Extremely well paid, and extremely high stress. If the database goes down, the company is hemorrhaging money until it's back up.
    • Software developers (my role): We coordinate with architects, project managers, business analysts, SME's (end users who are experts in their dept.'s), QA/testers, and technical writers. We wear a lot of different hats.
    • Project Managers: Usually not very technical, they drive the project schedule/deliverables/business meetings.
    • Business Analysts: Fairly technical, they help translate business speak into technical jargon (the good ones at least). They serve as a layer between the geeks and the business folks.
    • Technical Writers: In charge of writing technical manuals to help users learn the systems the developers write (these are some of the smartest people I've ever met by the way).
    • QA: These are the folks that developers go to (friendly) battles with day in and day out. They try to break our code, then tell us to fix it. They are the last step before a system is deployed.
    • Support Staff: The stuff we write will break at times, end users call these folks to help get it fixed.
    • Infrastructure Dept.: I have no idea what these guys actually do TBH, but the corporate LAN would not work without them. Very high stress role, and we constantly coordinate with them during deployments.

That was a huge digression, but backwards compatibility affects each of those segments differently. I'll briefly touch on how each type of business approaches backwards compatibility, and how it affects their bottom line. There are two types of backwards compatibility breaks:

  • Regressions, i.e. accidental breaks. These are inevitable in any system, hence why QA exists. They can't catch everything though. Regressions must be fixed, hence why hotfixes and patches exist.
  • On purpose, such as the breaks discussed above (businesses do this for one of the following reasons):
    1. Time to move on and ditch some legacy code or feature(s).
    2. Ditch an old feature in favor of a new one, usually to increase license sales (AutoDesk and Adobe are notorious for this to drum up new money). This isn't a bad thing per se, but it is annoying. In the case of AutoDesk, they will actually change file formats so that new files aren't compatible with old versions, and vice versa. Microsoft did this during the Office 2007 release (moved from .doc to .docx, .xls to .xlsx, etc.…but they maintained backwards compatibility).
    3. Scrap an old product completely and build a new one. Rare, but it happens.

IT is notoriously lax when it comes to testing backwards compatibility for one main reason: Time is money. Departments want their solutions yesterday, and don't understand testing. This is changing, but slowly. The irony is that businesses have the most to lose via regressions or compatibility breaks: If a LOB (line of business) app breaks, you have users sitting on their hands doing nothing, losing money. The good thing is that repairs are usually fast since IT sits in the business itself, and patches can be rolled out swiftly. IT loses money as well, since of course we don't bill for our own mistakes.

ISV's: Breaks might piss some customers off, but if the tool is essential to their job (Office, AutoCAD, Photoshop), users have no choice but to wait for a fix. They can usually go about their jobs in the meantime (not saying it's not a big deal), and patches are usually much slower to arrive than in IT, but ISV's have the upper hand.

Outward facing websites: This is where it gets interesting. In the case of Google, Twitter, and Facebook (all free, ad-based sites), these companies lose no money if a regression is introduced, or a feature changes or is scrapped (Timeline, anyone?)…users get what they pay for; you are at the mercy of each company and what they want to do. Granted, they may introduce a regression on their backend systems and lose money, but we'll never know if they do. All we want to do is tweet, share kitten pictures, and post dumb status updates/play our games. Sure, if we get the fail whale for more than 30 seconds we might start cursing, but in the end it doesn't matter. Of course there are service based websites out there that users do pay for, so if they go down or have regressions, it's a big deal, but we're still at the mercy of the company behind the service. Service based companies will usually prorate outages/errors off your bill if they are substantial enough.

 

Finally I want to touch on a company that is notorious for just flat out disregarding backwards compatibility altogether, blatantly at times (and I'll also touch on why they can get away with it): Apple. It should be noted that I actually like Apple; their hardware is amazing, OS X is nice, and iOS is the most intuitive mobile OS I've ever used (and I've used a lot). A brief history of Apple, and my experience with their hardware/software.

I grew up on Apple hardware. My first computing experience was in 2nd grade on an Apple II. It goes without saying I was hooked immediately. I graduated on to OS7 (our first home PC was a Mac LC II) in my early teens, got my own Mac when I left home (OS8 and OS9), then jumped ship for the PC world when I went to college. I've barely touched Macs since then.

A misconception about Apple that needs to be cleared up: They are not, nor have they ever been, a software company. They are a hardware company that bundles proprietary software with their hardware. Everything about Apple is proprietary (and I'm not complaining, but all the Apple kids love to rail on MSFT for being "closed"…Apple puts MSFT to shame in the closed-off department): Their OS, the language they use to program their OS (Objective-C), hardware specs/compatibility. And, they have made some of the largest, most blatant compatibility breaks I've ever seen:

  • After OS 9 (and the return of Steve Jobs to Apple), they scrapped their entire OS codebase in favor of OS X, which is a Unix variant (BSD and the Mach kernel) but not open source. There was essentially no backwards compatibility provided by Apple, which meant you had to buy all of your applications all over again. That also meant that every single software vendor that targeted pre-OS X had to rewrite their applications nearly from the ground up (the transitional Carbon APIs eased the port for some, but it was still a major overhaul), and many had to learn an entirely new programming language to boot: Objective-C. This would be the equivalent of Microsoft scrapping the Win32 API, on which virtually every Windows program runs. If MSFT did this, retribution would be swift and severe…hence why they could never, ever do it. The user base wouldn't stand for it (more on that later). Apple certainly didn't make any friends during this process, but eventually the ISV's adjusted and settled in. They did provide a very thin compatibility layer called "Classic" (which was far from perfect) but this was short lived because…
  • A few years ago, Apple made the switch from PowerPC chips to off the shelf Intel chips. Fiscally this made sense; it was much, much cheaper in the end. However, the CPU is the heart of the computer: Changing chip architecture is the equivalent of open heart surgery. This officially ended "Classic" support, and eventually support for any software that only ran on PowerPC chips. One good thing that came of this was the ability to run Windows on a Mac (natively or in a virtual machine), since Windows runs natively on Intel chips. Also, it turned out that the Intel chips outperformed the PowerPC chips by a huge margin. Apple did ship a translation layer (Rosetta) to run PowerPC apps on Intel, though with a real performance penalty, and in the span of a couple years they had still forced the major vendors to overhaul their apps…twice. Those firms have to recoup their development costs, at the expense of end users' pocketbooks.
  • Subsequent releases of OS X are notorious for breaking 3rd party software, and unlike Microsoft, Apple does not provide much assistance in patching those programs to run correctly, or provide backwards compatibility libraries in the OS itself. They just don't care. Also, for some releases, users had to buy all new hardware to run it. Yes, I know they are in the business of making money, but hard coding hardware requirements into the OS itself is ridiculous. MSFT does this sometimes, but it's easy to bypass if you're willing to pay the performance penalty.
  • Cable interface changes. One nice thing about USB is that every version uses the same basic connector, the versions are compatible with each other, and the specs are public. Apple recently came out with a successor to their FireWire interface called Thunderbolt. Want an adapter? Cough up 30 bucks. Apple/Intel are touting its 10Gb (little b)/s speed, which means very little in the real world of computing; it's like the megapixel myth in digital photography. A computer is only as fast as its slowest component, which will almost always be the disk drive. Even the fastest consumer solid state drives peak around 300MB/s, and spindle drives are less than 100MB/s at standard rotation speeds (5.4k and 7.2k rpm). Not nearly enough to saturate Thunderbolt's line speed (see the back-of-the-envelope sketch after this list). If you could go directly from memory to a Thunderbolt device (and vice versa) it would make sense. Otherwise, it's rubbish and a cheap marketing tactic meant to sell cables and new peripherals.
  • And their most recent backwards compatibility gaffe: the iPhone 5 ditched the standard 30-pin dock connector for a new, smaller one (Lightning)…fine, it allows the phone to be thinner. But this basically renders all your docking stations, connectors, etc. into paperweights, unless you are willing to cough up 30 bucks per adapter, or just buy new peripherals. At least include a free adapter in the box, Apple. If I were a tin foil hat toting type of person, I'd swear Apple was in bed with the peripheral vendors and getting kickbacks.
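The arithmetic in the Thunderbolt bullet is easy to sanity check. Here is a back-of-the-envelope sketch in Python, using the ballpark drive figures quoted above (they're the post's rough numbers, not benchmarks):

    # 10 Gb/s line speed vs. what a single consumer drive can actually push.
    THUNDERBOLT_GBPS = 10                          # gigabits per second
    line_speed_mb_s = THUNDERBOLT_GBPS * 1000 / 8  # = 1250 MB/s

    # Ballpark sustained throughput, per the bullet above (not measurements).
    drives = [("SATA SSD", 300), ("7200 rpm spindle drive", 100)]

    for name, mb_s in drives:
        pct = 100 * mb_s / line_speed_mb_s
        print(f"{name}: {mb_s} MB/s -> about {pct:.0f}% of a 10 Gb/s link")
    # SATA SSD: 300 MB/s -> about 24% of a 10 Gb/s link
    # 7200 rpm spindle drive: 100 MB/s -> about 8% of a 10 Gb/s link

In other words, a single consumer drive can't come close to saturating the link, which is the whole point: the headline number sells cables, not real-world speed.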

So, to wrap this up (and I hate to use Microsoft vs. Apple as my example), I'll try to be as objective as possible. Microsoft and Apple aren't really competitors except in the mobile space; this is purely one school of backwards compatibility vs. another, and why I think the two companies are polar opposites in how they go about it.

  • Microsoft does everything they can to ensure backwards compatibility in their software, up to and including helping vendors rewrite drivers and software to be compatible with new releases. But, why?
    • Simple: because they have to. I do honestly believe it is part of their company culture, but when stuff breaks, their customers are losing money (the overwhelming bulk of Microsoft's profits come from the business space, not consumer). Plus, businesses and IT now expect this from Microsoft; it's not optional. When MSFT has chosen to break backwards compatibility, they are extremely transparent about it, they work with customers, and they usually offer tools to bridge the gap (almost always free).
    • Microsoft owns some 95% of the desktop computing space. If something breaks or regresses, it affects tens of millions (if not more) of people. I do want to make something clear: I am not a Microsoft fanboy, though at times I may come off as one. They provide a good set of tools that allow me to get my job done, their software is easy to use, and let's face it, if you are in business, you have no choice but to use Microsoft products. What you choose to use at home is your choice.
    • In the end, it benefits Microsoft, and their vendor relationships. It's just good business practice to go the extra mile.
    • The downside is this leads to longer release cycles, increased cost of products, and bloated software with tons of legacy hacks in them.
  • Apple plays willy-nilly and loosey-goosey with backwards compatibility (I know that sounds pejorative, but this isn't a bad thing). Why?
    • Because they can, plain and simple, for two main reasons. Number one: Apple has something Microsoft will never have as much of: customer loyalty. The Apple brand is up there with Disney, Coca-Cola, and Joe Camel (when he existed). Apple has built a legion of loyal customers (I'll be honest, I'm beyond loyal to the iPhone line and don't see myself switching pretty much ever). Their hardware is awesome, and since their hardware is a closed eco-system, it is guaranteed to work. Number two: a small market segment in the computing space (they pretty much own the mobile space, but this post is about computing); generous numbers put it at 5% in the consumer space, and less than 1% in the business space (basically non-existent). Small market share + all of them consumers + who are beyond loyal = can break backwards compatibility. They may bitch initially, but Apple knows they'll keep coming back and dropping dough on their hardware.
    • Marketing. Apple is a marketing machine, and consumers are much more likely to react to marketing than businesses are. That's Apple's core market, and wow can they market the hell out of their hardware. I have met many technical Mac users, but the majority are not (and I'm not dissing Mac fans, these are just the facts), so when they see things like THUNDERBOLT, 10000000000gb/s TRANSFER, they buy into the hype and upgrade. Again, there is nothing wrong with this; that's what marketing is all about, creating demand and making customers drool, and Apple is one of the best in the business, if not the best.
    • So what are the benefits of breaking backwards compatibility at will? Apple is one of the most innovative companies in the world because of it. They can play around and stay on the bleeding edge without fear of much retribution. If vendors and consumers have hung in this long, with all the breaks they've made, they aren't going anywhere. That doesn't mean Apple gets free rein, but they know they can stretch their limits. The consumer (as long as they have the money) wins in the end. Microsoft will never have this capability since they have to kowtow mainly to the business space, which doesn't care about bleeding edge; it wants rock solid results.

I prefer the maintain-backwards-compatibility line of thinking, and in what I do for a living it's fairly easy to do, since web standards are generally backwards compatible with one another. Regressions are impossible to avoid, and very costly in my business. In the end, you have to find a balance: what do your customers expect, what are they willing to put up with, can you still be productive and turn a profit at the same time, and is purposely breaking something worth it if you might lose customers because of it?

I often wonder if Microsoft will ever pull an Apple and ditch the Win32 API. Sometimes I wish they would because it's ancient, written in C and Assembler, and probably has more cruft in it than my grandmother's basement. But I doubt it. Hardware keeps getting faster and cheaper, so they can just keep writing compatibility layers on top of it without worrying too much about performance hits. Will they ever ditch the NT kernel? Not anytime soon. Even the staunchest of Microsoft hating Computer Science students (who all come out of school praising *nix and how much MSFT sucks, then learn the business end of computing) will agree that the NT kernel is a pretty amazing feat of engineering. Regardless, backwards compatibility is an interesting topic. If you've made it this far, many thanks for reading.

 

Addendum: A little trivia about the Microsoft/Apple relationship. Microsoft might not exist in its current form had it not been for Apple. Microsoft was writing software for the Apple II (Applesoft BASIC, for one) years before MS-DOS existed. Had Apple not been around to sell licenses to, who knows what would have happened.