System Requirements

OpendTect system requirements

Supported platforms

OpendTect needs good hardware with up-to-date drivers, especially for 3D graphics. You can run it on:

  1. Intel/AMD under Linux (64-bit) or MS Windows (7 / 8 / 8.1 / 10, 32- or 64-bit)
  2. Mac/Intel under OS X 10.6 and up

Hardware requirements


OpendTect requires a recent, well-patched OpenGL installation. OpenGL drivers should be updated at least every six months to ensure optimal performance and compliance.

  • Intel/AMD: Recent nVidia and AMD (ATI) graphics cards/chipsets. The nVidia GeForce 500, 600 and 700 series and the AMD (ATI) Radeon 6xxx and 7xxx series are known to behave well.
  • Mac: Similar to the Intel/AMD platform.

Mainstream and high-end GPUs within these series are recommended, since low-end GPUs have shown poor performance generation after generation.
Shading functionality requires special GPU features, present in the cards listed above. Under Linux, however, only nVidia provides drivers capable of using the shading feature. If you can't see any colors on graphic elements, try disabling shading (Utilities-Look and Feel).
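On Linux, a quick way to verify which OpenGL driver is actually in use is to query glxinfo. The sketch below wraps that command from Python; it assumes glxinfo is available (it ships in the mesa-utils package on most distributions) and degrades gracefully when it is not:

```python
import shutil
import subprocess

def opengl_info():
    """Return the 'OpenGL ...' summary lines reported by glxinfo
    (vendor, renderer, version), or None when glxinfo is not installed."""
    if shutil.which("glxinfo") is None:
        return None
    out = subprocess.run(["glxinfo"], capture_output=True, text=True)
    return [line.strip() for line in out.stdout.splitlines()
            if line.lstrip().startswith("OpenGL")]

info = opengl_info()
if info is None:
    print("glxinfo not found - install mesa-utils (or equivalent) to inspect the driver")
else:
    for line in info:
        print(line)
```

If the reported vendor is not the one matching your card (e.g. a software renderer instead of the nVidia driver), shading problems like colorless graphic elements are likely.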


OpendTect will attempt to use 'shading', meaning that some calculations are done on the graphics card. Unfortunately, not all cards behave properly. Very old cards pose no problem because they report that shading is not supported. Very new cards usually support it correctly (e.g. all nVidia-based cards and chipsets). Some older cards do give problems: they report that they support shading, but implement only part of it, or implement it so badly that the system almost stops. There are two settings for the user to cope with this:

  • Do you want shading if the card reports that it is capable of it?
  • If so, do you also want it for volume rendering?

Some cards (like some ATI cards) support shading well in general, but fail at volume rendering. The defaults are:

  • Yes, use it if the card says it supports shading
  • No, do not use it for VR even if the card says it supports shading


  • If users get colorless inlines, time slices, etc., they should try disabling shading usage.
  • If users want to try improved volume rendering, they can try enabling it.

These options can be accessed via the 'Utilities-Settings-Look&Feel' menu.


OpendTect needs at least 2 GB of internal (RAM) memory available. We therefore recommend machines with at least 4 GB RAM, as the operating system (especially Windows) will need its share of this, too.

Please note: we are seeing an increasing number of crash reports from users attempting to run OpendTect v6 on machines with only 4 GB RAM. Machines of this grade appeared capable with versions 4.6 and 5, but the difference shows once v6 is installed.

Therefore, depending on the size of the surveys, we recommend 8-16 GB or more. In special cases (big surveys or many data cubes), and for data processing in larger surveys, considerably more RAM may be required.

(A rule of thumb is to have at least 10 bytes of memory available per displayed sample. Thus, to display 10 inlines with 2000 crosslines and 1000 samples per trace, you'll need a minimum of 200 million bytes of memory, i.e. 200 MB.)
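The rule of thumb above is easy to turn into a small calculator. This is only a sketch of the stated heuristic (10 bytes per displayed sample); the function name and parameters are illustrative, not part of OpendTect:

```python
def min_display_memory_mb(inlines, crosslines, samples_per_trace,
                          bytes_per_sample=10):
    """Estimate the minimum RAM (in MB) needed to display a block of
    seismic data, using the rule of thumb of ~10 bytes per sample."""
    total_samples = inlines * crosslines * samples_per_trace
    return total_samples * bytes_per_sample / 1e6

# The example from the text: 10 inlines x 2000 crosslines x 1000 samples
print(min_display_memory_mb(10, 2000, 1000))  # -> 200.0 (MB)
```

Remember this is a display minimum only; attribute calculation and processing on the same data will need memory on top of this.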


For Linux and MS Windows, a modern Intel or AMD processor is required. Although OpendTect will run on 2 GHz processors or even less, we recommend a 3+ GHz multi-core machine for a good working environment. Note that OpendTect makes heavy use of all processors when necessary.

Operating systems


A modern Linux distribution is required.

Linux distributions should be LSB compliant; you can check this using the lsb_release command. This requirement is particularly stringent for commercial plugins using the FlexNet licensing system. Documentation is available on installing license files for commercial plugins, along with a page of background information.
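The LSB check can be scripted. The sketch below wraps the lsb_release command mentioned above; it assumes nothing beyond the Python standard library and reports when the tool is missing (which itself suggests LSB support is absent):

```python
import shutil
import subprocess

def lsb_info():
    """Return the output of `lsb_release -a`, or None when the
    lsb_release tool is not installed on this system."""
    if shutil.which("lsb_release") is None:
        return None
    result = subprocess.run(["lsb_release", "-a"],
                            capture_output=True, text=True)
    return result.stdout

release = lsb_info()
print(release if release is not None
      else "lsb_release not found - the distribution may not be LSB compliant")
```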

64-bit releases are available for both SuSE and Red Hat-based distributions. OpendTect is known to work under Debian, Ubuntu and other distributions, as well as under earlier versions of the main distributions. Fedora is not recommended: although it may work, it is the only distribution that regularly fails in combination with OpendTect, probably because the graphics vendors do not support it well in terms of drivers.

Mac OS X

Minimum is Mac OS X (10.6) - thus Mac/Intel. Mac/PowerPC support is not available. A 3-button mouse is highly recommended.

Note: OpendTect v5 supports Mac OS X from 10.6 up to, and including, Yosemite. But El Capitan will not run OpendTect v5. OpendTect v6 supports all Mac OS X from 10.6 onwards, including El Capitan.

MS Windows

Windows 7, 8, 8.1 and 10 are supported. Windows should be kept up to date with the latest updates from Microsoft.
Releases are available in both 32-bit and 64-bit versions.


If you have mega-surveys with terabytes of data and you want to do very advanced calculations, you'll need the best you can get. What is best? The main idea is to minimize the bottlenecks.

  1. Graphics: use nVidia or possibly ATI-based cards; at least these manufacturers have good drivers for all cards. For nVidia, you may want to avoid the 'professional' series: it can be a waste of money (but may just give you that little bit extra you want, too). When in doubt, buy the top gaming card(s) you can find.
  2. CPU: choose 64-bit. Many cores, high speeds: the more the better. OpendTect will automatically use multiple threads in many situations. It depends on the type of attribute, display, etc., but we have put a lot of effort into making time-consuming tasks multi-threaded. We are well aware that the number of processors will grow steadily.
  3. Memory: buy as much memory as you can afford (and that will fit in the computer). Big clients, for example, use nothing less than 64 GB. OpendTect doesn't have many tricks to minimize memory consumption; we figure that memory gets cheaper by the day, so we greedily use memory for our purposes (we try not to waste it, though).
  4. Disk storage: this is often undervalued, but it is very often the crucial performance component. RAID can speed up disks considerably. If you can, work on local disks. We have seen many examples of overall performance being miserable simply because the data had to stream through (relatively) slow networks.

It's clear that the number of variables is huge, and that it's simply very difficult to predict whether a certain configuration will be good enough for your specific needs.
