I am not an anti-Microsoft person, but after having used Linux now without even Windows installed for two years straight, it is actually mentally painful to use Windows.
Yeah, use whatever works best for you -- but using Windows these days feels like I'm on my Big Wheel when I was three years old. In a rubber room.
I've never met someone who used any version of Windows because they thought it was a reasonable way to interface with a computer; it was always for application compatibility.
I do. I like Windows a lot. I'm completely platform agnostic and I've used all 3 major platforms at various times but I always come back to Windows.
I think there's a quiet majority of people out there who are perfectly happy with Windows but they don't tend to talk about it, whereas you do tend to get a lot of outspoken Mac and Linux users who are willing to both champion their platform as well as bash Windows at any given opportunity.
> I think there's a quiet majority of people out there who are perfectly happy with Windows
I think there's a quiet majority out there who have never tried anything but Windows, and who think of their computer as nothing more than a tool, completely oblivious to such abstractions as operating systems.
I've used Windows pretty much all my life, but I started using Linux (Ubuntu, briefly openSUSE, then back to Ubuntu) a couple of months back.
I don't see many problems with Windows, apart from the fact that it doesn't like user documents/data being moved to another drive. The resulting clutter from that is all down to me, though. I haven't had any complaints with Vista.
I'm warming more to Ubuntu now (although I only use it as a development platform on a laptop, I'd probably prefer it even more on the desktop) but I really don't see the "horrors" in Windows.
Blue screen of death at an inopportune moment -- then what? If the equivalent happens in Linux, you put in a live CD, get that all-important presentation off the disk, and go on your way. Not to mention that I can install Ubuntu freshly in under an hour; Windows takes several times that by the time you install 4 drivers (rebooting after each one), install a half dozen apps that aren't included by default, etc.
That was pretty much what sold me. I was 30-something hours into a Solidworks project that forced me to use Windows, and the machine BSOD'd on me at 4am the day it was due. If you rely on a machine, you have to have faith it will work during crunch time.
You can put a Linux Live CD in a Windows computer and still get at your data, and I haven't seen Windows crash that often recently.
I'm using Mac and sometimes Linux, and have seen Mac OS crash more often than Windows.
I feel more comfortable in Mac OS, though. I've used Ubuntu a few years back and prefer that over Windows as well.
Agreed, that's what I did, but at the end of the day I now have my unfinished project on a flash drive and no functioning computer to fix it -- not all that useful unless you have another Windows machine with Solidworks installed and ready to go, and at a few thousand bucks a seat, not many people do. I don't think Windows crashes any more frequently than Linux, but when it does, I feel it's more catastrophic.
I don't think there is anything noobish or wrong in thinking of your computer as just a tool (not that you said that directly, but it could be inferred). What is it, in the end? A tool for communication, development, research, whatever.
Some people just resist any kind of change - they're used to Windows, they aren't going to try anything else, because they think (and they might be right in theory) that they don't need anything else.
I installed Ubuntu on a laptop I gave to my father a few weeks ago, and he's been using it exclusively. I don't think he can tell the difference between Ubuntu and Windows.
You know, I'm not one of those weirdos that think desktop applications are going away. But I'd say most people who keep up with software development believe that's the case. <rant>I think that would be pretty bad if the best we'll ever be able to do is Javascript.</rant> Anyway, IF most apps people are using are Web based and compatible with Firefox or a Flash version supported in Ubuntu ... it seems less and less like there would be much point in paying for, using, and dealing with Windows.
I think most seniors can just about already switch fine to Ubuntu.
I suffer every time I'm faced with Yet Another *nix Package Manager, another mess of /usr /usr/local /opt /var inconsistencies as well as every time I have to remote to a Windows machine and can't use SSH.
There is an alternative, but it's not all sunshine and green grass on either side.
Suffering? It runs more than one application at once. You don't get viruses if you keep it up to date (unless you're an idiot). Copy and Paste works. There is nothing else out there that I need an OS to do for me. Everything else is in application land, where Mac and Linux sometimes fall flat.
I use Linux/Mac/Windows every day and have no trouble switching between them. If you are suffering under either Mac or Windows, it's your fault, not the OS's fault. If you're suffering under Linux, it's because you aren't enough of a moron to be a full time computer geek like me.
I wouldn't even say that. In most cases I'd say it's all about familiarity - both OS and application.
Perhaps a project manager needs MS Project, or an architect needs AutoCad or something but for the "teenage girl" demographic you would be hard pressed to find any particular application they absolutely needed that doesn't exist on Linux, or there isn't a ready counterpart (eg Adium instead of MSN Messenger). They're just not used to it and dislike anything unfamiliar, since computers are basically mystic black boxes anyway.
I think for many typical users it's simply a matter of preferring something they (mostly) know how to use.
It's important to use "would" instead of "could" when thinking about these things. If the teenage girl doesn't know about Adium, it's the same as there being no Adium.
It's now more likely that she would Google "MSN for linux" and be successful. That's relatively new. It still leaves her less likely to have the messenger. But that likelihood is not 0 even on Windows: there are plenty of 14-year-old girls out there who don't have MSN Messenger on Windows, because they 'couldn't' overcome some sort of hurdle.
I use W/M/L on a daily basis. User of all, fanboy of none. No platform strikes a perfect balance for me.
Windows accelerators are pretty good -- far beyond OS X's, in my non-expert experience with the Mac. I can't say much for Linux because it's pretty much all terminal :-)
Oh, the Mac mouse acceleration annoys me to no end.
More edits: actually, I can't say that about Linux. It's terminal for me because it complements my useless cmd.exe in a VM. The Linux accelerators are about the same, I guess.
He's not just talking about the GUI though. He's talking about the actual usage of the machine. Personally I find Windows to be a complete pain to use. The console is weak, the way files are laid out on the system irritates me, and their politics as a whole stinks.
This was true back in the days of cmd.exe, but with powershell the state of the windows command line is a different story.
Powershell is a huge step forward for Microsoft. I've been using it for about a year now and so far I'm very impressed. I'd even go so far as to say I prefer it to bash for certain tasks. The concept of passing objects around is, while strange at first, surprisingly intuitive and powerful.
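The object-passing idea is concrete enough to sketch. PowerShell itself can't run here, so below is a rough Python analogue of a pipeline like `Get-Process | Where-Object { $_.CPU -gt 10 } | Select-Object Name`: each stage receives structured objects rather than lines of text, so there's no awk/cut parsing step. The process data is made up for illustration.

```python
# A rough Python analogue of a PowerShell object pipeline such as:
#   Get-Process | Where-Object { $_.CPU -gt 10 } | Select-Object Name
# Stages pass structured records, not text, so no re-parsing is needed.
from dataclasses import dataclass


@dataclass
class Proc:
    name: str
    cpu: float


def get_processes():
    # Stand-in data; a real version would read /proc or call an API.
    yield Proc("firefox", 42.5)
    yield Proc("bash", 0.3)
    yield Proc("gcc", 17.8)


def where(objs, predicate):
    return (o for o in objs if predicate(o))


def select(objs, attr):
    return [getattr(o, attr) for o in objs]


busy = select(where(get_processes(), lambda p: p.cpu > 10), "name")
print(busy)  # ['firefox', 'gcc']
```

The point of the comparison is that in a text pipeline the `where` stage would have to re-parse column positions; here it just reads a field.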
If you're currently using a combination of windows + cygwin (as I was) I'd strongly suggest picking up a copy of "Powershell in Action" and giving powershell a shot for a couple of weeks. I'd be surprised if you don't switch to powershell.
I'll concede that overall powershell still has some significant obstacles to overcome before it's better than the Linux command line. But just as people stopped making legitimate complaints about Windows' stability when Windows XP came out, I think we'll start to see the same trend with complaints about the Windows command line now that powershell is released -- those complaints will no longer be legitimate.
Exactly. Powershell is some improvement, but still a pale imitation.
Windows feels fragile, and often is when you try to do anything difficult. How do you bring an ethernet adapter down and then up with a script in Windows? It's very difficult: WMI calls and such.
In Linux? A line (really, a few characters) of script. Just one of the differences I've run into recently.
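For what it's worth, here's what that few-characters version looks like wrapped in a script. The interface name (eth0) and Debian-style ifdown/ifup are assumptions; on iproute2-only systems the equivalent is `ip link set eth0 down` / `... up`. DRY_RUN defaults to on so the sketch just prints what it would do; clear it to actually run (needs root).

```shell
#!/bin/sh
# Bounce a network interface from a script. Interface name and the
# presence of ifdown/ifup are assumptions here; adjust for your distro.
IFACE="${IFACE:-eth0}"
DRY_RUN="${DRY_RUN:-1}"   # set DRY_RUN= to really execute (as root)

run() {
    if [ -n "$DRY_RUN" ]; then
        echo "would run: $*"   # print instead of executing
    else
        "$@"
    fi
}

run sudo ifdown "$IFACE"
run sudo ifup "$IFACE"
```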
>And if I picked a Linux distro from 8 years ago I'd find some bugs in it too. Use the current version of Windows, it has fewer bugs.
Perhaps that last bit should read "Use the current version of Windows, it has fewer known bugs."
As software gets larger and more complex the tendency is for more bugs to be introduced. Open Source mitigates this problem by allowing everyone to inspect and fix it.
By that logic, if your company provided Windows 3.1 you would go around saying Windows has no multiuser support.
Windows has an interface for doing what you asked. It had a bug / didn't work as described, so they fixed it. Your company won't buy the fix, but that doesn't mean the fix doesn't exist. It means you can't take advantage of it, sure.
But saying Windows doesn't support it is deliberately misrepresenting Windows to make it look bad. It's spreading FUD, and it's wrong.
Plus, why did you bring WMI into this? You can't seem to do it with WMI: http://msdn.microsoft.com/en-us/library/aa394595(VS.85).aspx "If you are not using DHCP, you cannot use WMI to disable a network connection". Were you just bringing WMI into it because it sounds scary and complicated and so would help shore up your FUD, or is there a way and this MSDN article is wrong?
You just said something monumental: "Your company won't buy the fix, but that doesn't mean the fix doesn't exist. It means you can't take advantage of it, sure."
That's one of the biggest things Linux has going for it: if something breaks, YOU DON'T HAVE TO BUY THE FIX!
You can A. Fix it yourself; B. Get the latest source, which probably has the fix (if it's a problem anyone else is having); or C. File a bug report and have the community fix it, usually within a day -- or even hours or minutes for really simple fixes.
Sure, Linux isn't for everyone, but for corporations and small businesses looking to save money, it looks better every day!
FOSS has many benefits, but I had always considered the fact that you can "fix it yourself" not to be one. Do any of you fix random bugs in your desktop applications? (Firefox, for example).
It might not be that real of an option for an average individual, but fixing it yourself in a large business could save you millions, depending on how big your IT Infrastructure is.
Think of this example. Company ABC has Windows XP deployed throughout their company. They need to upgrade some hardware, but the new hardware only has drivers for Windows 7 and Vista. Their only option (assuming a lot) is to upgrade to Windows 7 or Vista.
Let's say that they have 1,500 desktops that need to be upgraded. I'm not aware of the details of VLK license prices, but let's say this will cost $150/workstation, including labor to install it. Roughly approximating, this comes out to at least $225,000. This doesn't even count the amount of money they will spend battling incompatible software and hardware issues, after the upgrade.
--------------------------------
Company XYZ is using Linux Kernel 2.6.17, they have a problem with a piece of hardware that only has drivers that work with the 2.6.28 kernel. Company XYZ is a software development company, so they can just use their internal software developers to backport the driver from the 2.6.28 kernel to 2.6.17. They are already paying the developers a salary, so it won't cost them any more cash, maybe just a bit of productivity on other projects. This amounts to a huge savings over what Company ABC had to spend to fix a similar issue.
So wait, the thing I paid hundreds and hundreds of dollars for over the year won't give me the new fixes unless I fork over more money, but the free one will?
I can see not getting new features put into XP, but fixing BUGS should be free.
I didn't pay for bugs, I paid for software. If it's buggy and they don't want to support it, they should sell it at a discount. If they charge me for a complete product, but the product is buggy, they owe me more software.
It took me hours of research to do a simple down/up in Windows -- something that took me 30 seconds of research in Linux.
And you yourself illustrate all the odd and very long things you have to do in Windows to do a simple down/up of the NIC, and to get it to work correctly.
While in Linux, I can just do "sudo ifdown eth0"
All in all, I prefer the 30 second way, versus the five hour way.
By your very use of evidence, you proved my point. Cool.
You're still claiming that it took you hours of research to do in windows without acknowledging that you're using an ancient version and that it is a one line command in current versions.
I could counter that in the scope of all possible operating systems there's no reason disabling a network adapter should be a simple thing and that you're picking on it because you considered it an easy thing in Linux' favour.
I could counter how much better it is to type "tracert" on Windows than "traceroute" on Linux, or that the Windows PPTP setup wizard blows Linux out of the water, or that there are uncountable things I've spent hours researching on both platforms because they aren't as simple as I want them to be.
Or I could say "yoruban wax potato parachute"; I suspect any of the preceding would result in further argument about how I'm proving your point.
I set out to counter your claim that x is too difficult in windows, but this further back and forth - it's not going anywhere, is it?
Yes, I agree that the console is not the main reason one gets Windows or Mac. My rule of thumb is: less command line use -> better.
Ubuntu is pretty nice and I like their efforts to provide good human interface.
On the other hand, there's a whole bunch of software that runs only on Unix machines and not on Windows. I am irritated every time I need to ssh to a Linux machine just to do something simple for a class or project, but I stick to Windows for my personal needs, because I find its interface best for me.
For hybrid tablet users, Windows is the only way to go, and in that respect, I think the MS team did a pretty damn good job with the bundled functions in Vista.
And those who haven't used tablets before don't know what they're missing out on :-)
I just finished Jaunty; the out-of-the-box support was remarkable, but it still falls short of Vista.
Vista is really a "tablet-aware" OS; when you pick up the pen, all the penabled functions come alive. When you hold down the secondary button, an indicator appears. And you have the screen rotation buttons, on-screen keyboard, excellent handwriting recognition, and the journal app, which, despite its lame (non-svg) format, does a decent job.
Jaunty was awesome up until the point of dual head. My video card is partly to blame, because on xorg I can't set two screens of different resolution, so my second monitor looks terrible, not to mention the logout required. In terms of tablet functions, you have to readjust the coordinates after you add the second monitor. On Vista, everything works with close to no intervention, and plugging in another monitor, everything still works as you expect.
That's not to say Jaunty wasn't hella impressive; it was. But I'm talking about built-in features, and in this respect Vista is still ahead. Anyhow, back to ubuntu on the VM.
Can someone with knowledge of the underlying OS mechanisms answer this for me: What is the deal with file access on Windows? In Linux (on the Mac too, I think) I can move, copy and delete files freely, regardless of what applications currently have that file open. On Windows, I always run into some situation where it won't let me do what I want to do because the file is open. So I have to hunt through 10-20 windows to see who has it (9 times out of 10 its a command prompt or explorer window holding on to a directory that I'm trying to delete.)
I don't use windows often so maybe I'm exaggerating, but this is the single most infuriating windows usability nitpick I have. Have they fixed this in Vista or 7?
This happens because of the underlying implementation of the file systems. UNIX-based systems have the notion of "linking/unlinking" files: a file is a link to a set of sectors on your hard drive. One can create many more links to the same data (using the ln command). Deleting is the same as unlinking, i.e. removing a link. When the last link disappears, the file is considered "deleted". When an application opens a file, it creates yet another reference to it, so nothing stops you from removing the link -- the file isn't really deleted yet, since your app still holds its own reference.
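Those unlink semantics are easy to see from a short script on a POSIX system: delete a file while a handle is open, and reads through the handle still succeed, because the data isn't reclaimed until the last reference goes away. File names here are throwaway temp files.

```python
# Demonstrate POSIX unlink semantics: removing a file's directory entry
# while a handle is open does not destroy the data; the open handle
# keeps the inode alive until it is closed.
import os
import tempfile

fd, path = tempfile.mkstemp()
os.write(fd, b"still here")
os.close(fd)

f = open(path, "rb")          # a live reference to the inode
os.unlink(path)               # directory entry is gone...
assert not os.path.exists(path)

data = f.read()               # ...but the data is still readable
print(data)                   # b'still here'
f.close()
```

On Windows the `os.unlink` call would typically fail with a sharing violation instead, which is exactly the difference the parent comment describes.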
I'm not sure how Windows can be "fixed" in this regard, since their filesystems don't have the notion of hard linking, nor does the Win32 API: there is no way you can move/delete a file on Win32 while keeping existing open file handles alive.
However, the biggest issue with Windows, in my opinion, is the registry. Nearly all "popular" Windows gripes can be traced back to registry abuse: decreasing performance over time, adware/spyware, viruses, etc. The registry is like a secondary file system, but much slower, more primitive, less secure and, most importantly, nearly completely hidden from users: there are no tools, with the exception of the primitive regedit, to deal with it.
Honestly, the registry is no worse from an end-user POV than the /etc filesystem under Linux. In point of fact, given the diversity of file formats and sources in /etc, it's probably fair to say that the registry is equally navigable and understandable to the average Windows power user as the config tree is to a Linux fan. Because keys are typed and support ACLs, it's also arguably better-protected against accidental or intentional breakage.
If you want to attack an ad-hoc config system, then perhaps talk about the mess that is OS X 'defaults', or GConf's half-assed reimplementation of the Windows registry for GNOME.
That being said, I am most certainly not a Windows partisan. The last time I worked primarily on a Windows system was when I worked for a large hardware manufacturer, and the entire company ran on Exchange + Office. That was over five years ago, and I have been almost 100% on Mac OS X + Linux in the meantime, with only occasional use of Windows in a VM to verify IE compatibility.
The reason is this: for the tools I need to be productive (Vim, Ruby, Java, Firefox, MySQL, IM client, email client) Windows is at best equally capable, and at worst a second-class host, to Linux. OS X is about as suitable a platform for those tools, but has a much smoother media (esp. online video) and mobile story. Hence, the advantages of Windows (huge application library, obscure HW compat) are largely moot, while its disadvantages (poor POSIX/UNIX compat., weird dev tools, cost) are more apparent.
In my Linux days I'd just make a copy of /etc and my home directory and that was always enough to get a new machine, install Debian, run a few aptitude commands, restore /etc and move on.
Try doing that on Windows. There is always a small army of GUI tools, often from 3rd parties, to help you accomplish things like these. Just recently I had to get rid of Adobe CS3 on my Mac: it was easy enough -- just run find | grep on your / and you're in good shape. On Windows, a big part of the game is played in the registry, and good luck cleaning it up after something as massive and badly written as Adobe software.
Ask an average Microsoft SQL Server user: "WHAT IS MS SQL SERVER?" I mean in terms of the files and configuration data you're putting on your machine when you install one. I bet very few people know precisely what it is. MS SQL, just like any piece of reasonably big Windows software, is a complex mesh of files scattered all over your hard drive, hooked together with a few hundred registry entries scattered all over your registry, and replicating your SQL configuration on another machine is pretty much impossible without yet another complex Windows GUI tool.

Internet Explorer is also like that: there are all sorts of hooks and back doors in the registry that you can stick your DLL into, to be loaded and considered a part of MSIE the browser. So an average user, without additional GUI guidance, has no chance of figuring out where all these popups are coming from -- hence the need for (and the big market in) various spyware/adware removal tools. On Linux/Mac these tools make no sense: a 10-line bash script would accomplish all they do. The reason? It's the registry, the biggest engineering fuck-up in the history of Windows.
This is why I hate seeing Gnome moving in that direction -- their GConf is a reincarnation of the same "wonderful" idea: badly re-implementing a file system for the sake of an elusive "centralized configuration storage" advantage. There isn't any advantage in centralizing your config in some non-standard, complex format: you aren't gaining anything, and you're losing the huge army of tools, techniques and accumulated knowledge that standard file systems come with. Just use them; store your config files as files in a file system -- it's freaking great at storing files.
> In my Linux days I'd just make a copy of /etc and my home directory and that was always enough to get a new machine, install Debian, run a few aptitude commands, restore /etc and move on.
Even better: go make a VCS-repo for your /etc. It's absolutely great.
I have /etc in a git repo, and an hourly cron job to commit whatever there is to be committed (I should probably use incron instead of cron, but right now, I'm too lazy.) That way you can fix mistakes that got introduced sometime in the past, and you can see the diffs between different versions of your configuration. Of course, it has the same benefit of being able to just clone the configuration on another machine.
EXCEPT, you also get to set up different branches for different hosts, if you need that, but that might be overdoing it already.
Seriously, try keeping more stuff under VCS. I use it for my ~/, my most important dotfiles, and /etc. Great thing.
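The whole setup is only a handful of commands. In the sketch below, a scratch directory stands in for /etc so it can be tried without root; on a real machine you'd run the same commands as root inside /etc itself (the throwaway committer identity is just so the commits go through anywhere).

```shell
#!/bin/sh
# Keep a config directory under git: snapshot, change, snapshot, diff.
set -e
CFG=$(mktemp -d)            # stand-in for /etc (no root needed)
cd "$CFG"
echo "PermitRootLogin no" > sshd_config

git init -q
git config user.name  "etc-snapshots"     # throwaway identity
git config user.email "root@localhost"
git add -A
git commit -qm "initial snapshot"

# Simulate a config change, then snapshot it again.
echo "PermitRootLogin yes" > sshd_config
git add -A
git commit -qm "hourly snapshot"

git log --oneline                 # two snapshots now exist
git diff HEAD~1 -- sshd_config    # shows exactly what changed, and when

# The hourly automation from the parent comment is one crontab line, e.g.:
#   0 * * * * cd /etc && git add -A && git commit -qm "hourly snapshot"
```

The etckeeper tool packages essentially this workflow (including hooking into the package manager) if you'd rather not hand-roll it.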
The main reason I hate the registry and much prefer /etc is that I can do `tar -czf backup.tar.gz /etc/*` and have a backup. I can also fix my machine if things are so bad I have to boot into single user mode. Scripts can easily look at system settings. On the plus side, the registry is more consistent; you don't have to figure out each program's config language.
I think you can export pieces of the registry, and if Windows had better command line tools, scripts could easily look at settings. But as it is now, it's a lot easier to backup and restore /etc than the registry.
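The full backup/restore cycle for /etc really is just a tar round trip. A sketch, again with a scratch directory standing in for /etc so it runs without root (on a real box the backup line is simply `tar -czf backup.tar.gz /etc`):

```shell
#!/bin/sh
# Back up a config tree, break it, and restore it -- three commands.
set -e
ETC=$(mktemp -d)                       # stand-in for /etc
BACKUP=$(mktemp -u).tar.gz
echo "nameserver 127.0.0.1" > "$ETC/resolv.conf"

tar -czf "$BACKUP" -C "$ETC" .         # 1. take the backup
rm "$ETC/resolv.conf"                  # 2. simulate the disaster
tar -xzf "$BACKUP" -C "$ETC"           # 3. restore in one command
cat "$ETC/resolv.conf"
```

There is no comparably simple, scriptable round trip for the whole registry, which is the asymmetry the parent comment is pointing at.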
Windows has had hard links for a decade or so. Vista even ships with "mklink" which can make three different kinds of links. Before that, you could download utilities to make them. Windows won't unlink a link that is in use because its normal filesystem API promises that behavior. You can use different APIs to get different behavior.
The registry exists in the format it is in to boost performance, not hinder it. It supports approximately the same ACLs as the filesystem, so there is nothing that makes it inherently less secure than the filesystem. And the registry is designed to be exposed by custom GUIs, not by regedit. For example, to change the registry setting for the default program to open ".html" files with, you could use regedit, but the Windows philosophy says there should be a dedicated UI for it instead (the "Default Programs" UI in this case).
'Strue. I do get frustrated with the registry model, although I've gotten good at dealing with it. On the other hand, custom GUIs help to improve Linux penetration, since many people are intimidated by the command line or by configuration file editing.
I like Win and Linux, Mac seems to me to combine the worst of both. Sure, it's personal. Application compatibility matters, Linux lags badly on video & audio editing tools.
When I first ran Linux around 1993, what really annoyed me were Emacs & vi. Sure, they're powerful, but extremely alienating to new users. DOS had Edit, or Windows 3 had Notepad. Ubuntu and like distros succeed because most people just want to drive without learning how to be a mechanic. Linux's biggest problem is people's perception that there's no 'standard' distribution and that they're going to have do an awful lot of icky maintenance.
I don't know whether or not it's fixed in more recent versions, but there are programs which get around this for you (by killing processes, or telling you which process is using the file/folder). E.g. Unlocker http://ccollomb.free.fr/unlocker/
Can someone with knowledge of the underlying OS mechanisms answer this for me: What is the deal with file access on Windows?
Windows' defaults when opening file handles have actually included locking for so long that applications have come to depend on it.
If you check the Win32 API, you will see that applications have several options for requesting (or not requesting) file locks when doing file operations; when they don't specify, the defaults play it safe and ensure that no data is lost due to improper access.
Technologically speaking, this is kinda the same as you have in Linux, except that the defaults are the polar opposite.
Changing the defaults now would wreak havoc among the multiple applications relying on them and cause massive amounts of corrupted data. So Microsoft has kept them as is.
IOW: If you are having issues with file-locking, you need to blame the application-developer for not properly signalling what level and type of locking is actually required, because Windows will respect the locks set on file-handles. Maybe I'm the exception, but I consider that a good thing.
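The "polar opposite defaults" point is easy to see from a script on Linux, where locks exist but are advisory and opt-in: opening a file never conflicts, and only explicit lock requests can. A minimal POSIX-only sketch (file names are throwaway temp files):

```python
# Linux file locks are advisory and opt-in: a second open always
# succeeds, and only an explicit lock request can be refused.
# (Contrast Windows, where the open itself enforces share modes.)
import fcntl
import os
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)

a = open(path, "w")
b = open(path, "w")              # no error: opening is never blocked

fcntl.flock(a, fcntl.LOCK_EX)    # a takes an exclusive advisory lock
try:
    fcntl.flock(b, fcntl.LOCK_EX | fcntl.LOCK_NB)
    conflict = False
except BlockingIOError:
    conflict = True              # refused only because both sides opted in

print(conflict)                  # True
a.close()
b.close()
```

So on Linux, a program that ignores locks can still scribble over the file; on Windows, the default share modes stop the conflicting open up front, which is the behavior the parent comment is defending.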
Apart from my server machines, I'm primarily a Windows user, because I've always felt that Windows (esp. XP) saves you a lot of time in configuring stuff compared to *nix (not considering Mac). Of course it restricts you a lot, but with tools like GnuWin32 etc. you can create quite a powerful dev box.
Once I decided to install Ubuntu 8.04 on one of my machines, but X didn't work due to some conflict with the graphics card. After a lot of forum browsing and hair pulling, I finally went to #ubuntu on irc.freenode.net to beg for some help. I explained the problem I was having and casually told them "I'm using Windows right now". The reply was "You mean, struggling with Windows?". Yeah, right!
It's a combination of brown, peach (the colors from the original 'human skin tones' Ubuntu concept) and orange/red (newer, based on the Canonical logo).
I actually tried to organize a theme contest in conjunction with DeviantArt for 8.04, working with the Ubuntu Artwork team, with prizes etc.
The first thing that happened was that winners were limited to working with the existing palette. The directive came from Mark Shuttleworth, and the Ubuntu art team was powerless to change it. The contest didn't go ahead.
However Mark seems to have recently changed his mind.
I remember listening to an interview with one of the original Doom developers. He said that one of the ways that they keep players interested in a game where players can go anywhere they want was by hiding secret power-ups and prizes all over a level. By rewarding a player who goes exploring, it communicates to the player that the developers want you to be interested, and that exploring can be fun.
Linux is sort of like that. You can't use all of it without exploring, so it's good to hide little customizations and treats all over so when somebody stumbles across it, they say "This is so cool!" and keep looking, learning. :)
"The desktop will have a designer's fingerprints all over it - we're now beginning the serious push to a new look. Brown has served us well but the Koala is considering other options. Come to UDS for a preview of the whole new look."
The default theme for PCLOS Gnome tempted me to try it and as a result convert from Ubuntu. The distro first has to look nice to get people interested and then work well to keep them on it.
Again, not an anti-MS person here, I support a large number of Windows clients, but it is not my OS of choice.
Maybe there are user interface tweaks you can perform on Windows (do tell), but I can never seem to get it as minimal as running something like fluxbox on Linux (to say nothing of ratpoison, stumpwm, or xmonad). My very unscientific poll of the computer support staff at my office turned up fluxbox, openbox, and twm. My Windows usage is cursory, so maybe I'm totally wrong on this.
One option is LiteStep (http://en.wikipedia.org/wiki/LiteStep), which can make the Windows UI like AfterStep. A friend of mine uses it and claims it works very well.
Some other shell replacements exist, but I've never tried any of them.
Which, as the link suggests, is all about trying to make Windows tolerable for users of more advanced systems when said users are forced for whatever reason to hold their noses and use Windows.
I've been using the Kubuntu RC for a few days now. After the disappointment that Kubuntu 8.04 was for me, I'm happy with this release. I've finally made the switch to KDE 4. KDE 3, you've served me well, but it's time to move on, and KDE 4 is finally good enough not to make me scream (as it used to).
Ubuntu is a fantastic distribution of a magnificent OS. I just wish I could run design apps like Flash and Photoshop on it (and no, GIMP just doesn't do it for me).
Give VirtualBox (http://www.virtualbox.org/) a try. I run Photoshop CS4, Rosetta Stone v3, and backup my BlackBerry inside it with minimal problems. GIMP didn't do it for me either.
I use VirtualBox as well, and I also use WINE. Even though you can run Office 2K7 in WINE (and you can't get around Office as a student, in particular as a business student), which I do for short edits and to just view documents, I still mainly use Office on the emulated WinXP. Office feels snappier on the emulated machine, and especially because of font antialiasing, text is easier to read. Overall it's just nicer to work with and it crashes a lot less... :-)
I've now been running Ubuntu for over a year -- for my own side business, for my consulting gig, and as a student -- and I haven't had any issue I couldn't solve. One of the main solutions is to bow to the majority in some cases and use emulated XP or WINE.
Thank goodness in my engineering department Matlab or Excel are acceptable. Matlab works well in Linux and, paired with OpenOffice, is a much cheaper, more powerful option than the Office student edition. (My school runs Office 2003 though, so that's a big difference in the file formats teachers require.)
The new Ubuntu has better font smoothing by default; sadly, I don't think that transfers into WINE, but at least you found a setup that works.
As long as you're paying for PS, check out Codeweavers' Crossover Linux. You can run native Windows programs under Linux (Crossover is a tweaked version of Wine).
If I remember the Wine Weekly News report correctly, I think Disney wanted it working on their Linux machines they used for rendering, and committed back all their fixes to Wine. So whatever that version was is probably pretty usable.
I find VMWare performs quite well for that stuff. Still, I don't want to use Windows even in that scenario. So I just use a Mac instead. I think if just Photoshop had a decent counterpart on *nix that'd be enough to get me to switch.
The main obstacle for migrating completely is a handful of the usual software titles, and tablet support. For tablet computers, like it or not, Vista is still the best, hands down.
But my jaw just dropped with the out-of-the-box support in Jaunty. It is marvellous. If multi monitor works just as smoothly, today might just be the day.
How well does VMWare work for you? My guest OSes are always as lean as possible with all graphics stripped out, and they are still laggy.
Speaking of Ubuntu usability, I just don't understand why they require me to pick the location closest to me in order to download Ubuntu. In the age of free IP geolocation databases, they could at least pre-fill this field with a best guess based on my IP, or pick the least-loaded server...
I tried the beta of this on my Lenovo Y510 with Intel graphics and the performance was horrible, due to the video drivers not being updated for the new Xorg 1.6 architecture.
On a positive note, my grandfather continues to enjoy his new Ubuntu 8.04.2 machine though, a welcome change to the Xubuntu 7.04 machine that was a 500Mhz P3 with 128MB RAM.
The netbook release says it's for Atom (though the compatibility list includes eee PC 701, which has a Celeron - by "Atom" they surely just mean intel-compatible).