Back towards the dawn of OS X, there seemed to be a great deal of hubbub, at least in the Mac world (I was nowhere near scientific computing at the time), about the Mac OS as a platform for scientific computing and HPC applications. XGrid came out of the box, Virginia Tech had their fancy Mac-based computing cluster, Stanford was doing cool things, etc. More recently, however, things have been quiet. The Macresearch.org site is essentially a ghost town filled with spammer zombies, the XServe is dead, and an awful lot of the marketing literature and the like seems to date from the pre-Intel-processor era.
But XGrid is still there, the whole *nix OS underpinning is there, and the platform seems to have decent support among Python, R, and some of the newer languages. So, from people who know more about this than I do: how fares OS X? Are Macs viable client-side computers for scientific computing? Is using them as a server/cluster/etc. through XGrid or something like it simply a novelty application?

@dmckee: Yeah, I could see that. I think many people will agree with you.
For me, the issue wasn't so straightforward because I got tired of hardware failures, and even then, I still run Linux in a virtual machine. I've encountered people who like their workflow to be mostly OS X, so I think it's a reasonable question for a niche community. I also feel like Computational Science could use a few more questions, and responses from multiple perspectives would be helpful in giving people an idea of what tools we use in scientific computing. – Jan 3 '12 at 0:00.
I can't comment on the server side of things. On the client side, at the one computational science meeting I go to every year, the proportion of Mac users seems to have increased. I switched to Macs primarily for the hardware, after getting tired of my school-supplied Dell laptop failing at the drop of a hat; Consumer Reports rated Macs highly for durability. I don't think Macs are good for scientific computing unless you run Linux on them. Linux support for Mac hardware tends to lag; usually it's the wireless card that isn't supported (whenever Apple changes it in a new model). If you're willing to accept the resource penalty that comes with running a virtual machine, it's an attractive option (and one that I personally use).
Macs require you to install a lot of libraries and software packages before you can do serious scientific computing. Anything that has a Mac installer is easy to manage, so if you do most of your development work with Matlab, Mathematica, Maple, Python, etc., it's easy to install and run that software natively on OS X. It's harder to track down hard-core numerical software that has a Mac installer (think of things like PETSc or CLAWPACK). Package managers like MacPorts and Homebrew can help the situation if you want to use OS X only.
You'll also have to compile a lot of packages from source. If you want to run your code anywhere else, you'll have to watch out for compatibility issues.
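To make those two routes concrete, here is a minimal sketch; the package names are illustrative only, and whether a given library is actually packaged depends on your MacPorts/Homebrew version:

    # Route 1: install through a package manager (if a port/formula exists)
    sudo port install petsc        # MacPorts (package name assumed)
    brew install petsc             # Homebrew (formula name assumed)

    # Route 2: the classic from-source build when no installer exists
    tar xzf some-numerical-library.tar.gz   # hypothetical tarball
    cd some-numerical-library
    ./configure --prefix=$HOME/local
    make && make install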
Since Linux enjoys widespread usage in scientific computing, it's easier from a portability standpoint to develop code in Linux. I've heard anecdotally from highly opinionated friends that setting up a development environment on a Mac is a huge pain in the ass; others have said it's not so bad. Your mileage may vary.
I downvoted: how is installing anything for scientific computing on the Mac more complicated than on Linux? Sure, if you start from a CAE-oriented Linux distribution most of it is built in, but for most other distros you have to download packages and/or build from source, which you might want to do anyway to tune the libraries for your specific needs and/or maximum performance. That being said, Mac OS will have a hard time being adopted on a large scale at universities as long as Apple doesn't license the OS independently of the hardware (not that they should).
Not sure how you can justify the extra cost. – Jan 3 '12 at 1:49.

@FrenchKheldar: There are a lot of ancillary things that have to be installed on a Mac (I write this on a MacBook Pro) when building software - a lot of libraries and tools - that are just an apt-get or yum install away on a Linux box, to say nothing of bigger items like newer compilers, a newer Python, etc. That's why there are MacPorts and Homebrew, but those things still don't always install cleanly, and not all packages are in both. Depending on what you want in your dev environment, it can be, and in fact is, harder on a Mac. OTOH, you get Xcode.
– user389 Jan 3 '12 at 2:05.

It's considerably easier to install something from a package manager than it is to compile the same code from scratch. At best, it's a simple ./configure && make && make install, but at worst it can be a maze of sorting out scripts, flags, and library locations.
Installing PETSc 3.2 from a Debian package took me 5 seconds (I even did this at the behest of some of the developers); installing it from source in Linux took all day with the various options. The point was that packages and installers don't necessarily exist for the Mac when they do for Linux (and they make Linux easier to use). – Jan 3 '12 at 2:13.

There's also the huge annoyance of being stuck with 'only stable' versions on Macs; I've had to fight with that several times. A good example I've personally hit is a GCC bug that made an app simply crash and burn if you had OpenMP pragmas inside code called from a pthread. I ended up using and supporting ICC instead, since it was available and that was easier than changing the GCC on that Mac. There's a bunch of other libraries I've seen on Macs stuck at prehistoric versions (well, at least a couple of years old). – Jan 3 '12 at 5:50.
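For what it's worth, a rough sketch of the two workflows being compared here; the package name and configure options are illustrative only and vary by distribution and PETSc release:

    # Debian/Ubuntu: one command, a few seconds (package name assumed)
    sudo apt-get install petsc-dev

    # From source: the kind of session that can eat a day
    tar xzf petsc-3.2.tar.gz                 # hypothetical tarball name
    cd petsc-3.2
    ./configure --with-mpi-dir=/usr/lib/openmpi \
                --download-f-blas-lapack=1   # options depend on your needs
    make all                                 # PETSc's build also wants PETSC_DIR/PETSC_ARCH set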
@Geoff gives a good answer, but I think it's worth providing an alternative perspective. I do everything on Macs - in OS X, not a Linux VM - including lots of scientific code development. I mostly work in Fortran and Python. For me, the convenience of being able to do all my work in one OS, and of almost never dealing with hardware failures or driver issues, is worth the cost of Mac-specific headaches.
The three main headaches are:

1. Lack of an OS-standard package manager.
Once upon a time I used Fink, but eventually it led to more headaches and it's now obsolete. I've heard good things about Macports and Homebrew, but my experience with Fink convinced me to just 'roll my own'.
2. Some of the built-in software is very outdated, particularly Python and gcc. This means you need to install your own updated versions, which can be a hassle.
3. Apple does not include a Fortran compiler! (One way to deal with headaches 2 and 3 is sketched below.)

It seems to me that Apple is paying less and less attention to its Unix-based power users. Meanwhile, Linux keeps improving. Eventually I will probably be pushed back into Linux, but I will keep my MacBook until somebody else learns how to make decent batteries. I would argue that the Mac is a better environment for computational scientists than it is for computational science.
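For anyone who does want to try the Homebrew route for headaches 2 and 3, a minimal sketch, assuming the gcc and python formulae are available for your OS X version (Homebrew's gcc formula includes gfortran):

    # install a current GCC (with gfortran) and a newer Python
    brew install gcc python                  # formula names assumed
    gfortran --version                       # check the Fortran compiler is on your PATH
    which python && python --version         # Homebrew's Python, if /usr/local/bin comes first in PATH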
I would not want to use Macs in a commodity computing environment; the hardware is, relatively speaking, way too expensive for that. It can be a pain to get the software environment up to match the conditions needed for a particular package, but usually once you've figured it out the first time, it's a lot easier to maintain than a comparable Windows installation. (And, depending on the package managers, it can be as easy as Linux.)

Use of OS X in HPC and scientific computing is low, and that has a lot to do with the pros and cons of OS X relative to the alternative (Linux).

OS X Pros:
- Polished UI; still *nix.
- Desktop/design apps such as MS Office and the Adobe programs are well supported.
- Multimedia is very well supported.
- Some people like the Apple ecosystem (iPhone, iTunes, etc.).

OS X Cons:

- Runs on expensive hardware, and not everyone likes MacBooks, especially people used to ThinkPads (keyboard + TrackPoint).
- Cannot upgrade the hardware (e.g., if you want to try the latest NVIDIA card with your CUDA app) on a desktop/cluster.
- Bloated GUI that cannot be customized (in Linux you can use a minimalist window manager).
- Package management in MacPorts/Fink is sub-par compared to Linux (Debian) distributions. Most packages are not even actively maintained, or are orphaned.
- Some useful tools/programs traditionally did not run, or still don't run, on OS X. Sun Studio still doesn't work. Valgrind only started working recently, and not all features are supported. Intel compilers, too, have only become available in recent years.
- Apple doesn't even package a Fortran compiler, so you have to rely on third parties (mostly individuals) to build binaries that work only on the OS X versions the individual happens to have. Support is rare or non-existent in such cases.
- Commercial scientific apps (ABAQUS, ANSYS, FLUENT, and many more used in industries such as oil, finance, engineering, etc.) do not run natively on OS X.

Linux (Debian) Pros:

- First-class package management, i.e., installation of compilers, most numerical/scientific libraries, etc.

I have been using nothing but Macs on the desktop (and laptop) for many years, doing scientific computing and scientific software development among other things. As others have pointed out, the quality of the hardware, the high quality of much Mac-specific software, and the ability to handle Word and Excel when necessary make the Mac a very nice platform for daily use.
I have also been running a Mac-based compute cluster as an experiment for a while. It's an experiment I am not tempted to do again. Compared to a Linux cluster, I don't see any significant advantages, other than ease of software installation if you have Macs on the desktop anyway (just install the same stuff). The disadvantages stand out clearly, most of all the lack of proper multiuser GUI support. On a Mac, one machine equals one screen and at most one logged-in user. That makes GUI-based tools a pain to use.
Compared to that, even plain X windows under Linux is a joy to use, and then there are VNC and NoMachine NX to do even better. Yes, I know the Mac supports X windows, but most GUI programs for the Mac use the native interface.
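As a rough illustration of that remote-GUI workflow against a Linux node (the hostname and GUI program here are placeholders):

    # On your workstation: forward X11 over SSH, then start a GUI tool on the node
    ssh -Y user@cluster-node.example.org   # -Y enables trusted X11 forwarding
    paraview &                             # runs on the node; the window displays locally

    # Alternative: a persistent remote desktop via VNC
    vncserver :1 -geometry 1920x1080       # run on the node
    vncviewer cluster-node.example.org:1   # connect from your workstation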