Infolab cluster

Beta warning

If Google can keep things in Beta, why can't we? So beware: things might break. Please join the mailing list and report any glitches you come across.

It was recently decided that a split personality is not the best thing for a cluster to have: two different resource managers end up competing for the same nodes without being aware of each other. That is why we have separated our cluster into a Compute cluster and a Hadoop cluster. Read on for more about the two.

Mailing list

There is a mailing list for everyone interested in the current status and configuration of the cluster:

Compute cluster

The compute cluster comes in handy whenever you need a lot of cores to get your job done. It works just like looking at the CPU and memory load of the other servers and deciding which one to use for your job, only the job scheduler takes care of looking at the load for you and allocates resources on a first-come, first-served basis (at least for the time being; queue priorities may change in the future). A minimal sketch of submitting a job is shown after the software list below.

Hardware

  • 1 head node: iln1
  • 2 development nodes: ild1, ild2
  • 28 compute nodes: iln1 - iln28
    • 896 CPU cores
    • 1792 GB RAM

Software

  • Torque resource manager
  • MAUI job scheduler
  • CentOS 6.3
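
Since jobs on the compute cluster go through the Torque/Maui stack listed above, they are submitted with qsub. The following is only a minimal sketch, assuming qsub is on your path; the job name, resource request, and the my_program command are hypothetical placeholders, not part of this page.

    import subprocess

    # A minimal PBS job script: ask for one node with 4 cores for one hour
    # and run a placeholder command from the submission directory.
    job_script = """#!/bin/bash
    #PBS -N example-job
    #PBS -l nodes=1:ppn=4
    #PBS -l walltime=01:00:00
    cd $PBS_O_WORKDIR              # run from the directory the job was submitted in
    ./my_program --input data.txt  # hypothetical command; replace with your own
    """

    # qsub accepts the job script on stdin and prints the id of the queued job.
    proc = subprocess.Popen(["qsub"], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    out, _ = proc.communicate(job_script.encode())
    print("submitted job: " + out.decode().strip())

After submission, qstat lists the job and its current state in the queue.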

Resources

  • Using the compute cluster (InfolabClusterCompute)

Hadoop cluster

If you want to run map/reduce jobs, then this is the cluster for you. A small Hadoop Streaming sketch is shown after the software list below.

Hardware

  • 1 head node: iln29
  • 7 compute nodes: iln30 - iln36
    • 224 CPU cores
    • 448 GB RAM

Software

  • Apache Hadoop 1.0.3
  • Pig
  • Rhipe
  • CentOS 6.3
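
Since the cluster runs Apache Hadoop 1.0.3, one simple way to get a map/reduce job going without writing Java is Hadoop Streaming, which pipes HDFS data through any executable. The word-count script below is only a sketch; the file name wordcount.py and the HDFS paths in the example command are hypothetical.

    #!/usr/bin/env python
    import sys

    def mapper():
        # Emit "word<TAB>1" for every word read from stdin.
        for line in sys.stdin:
            for word in line.split():
                print("%s\t%d" % (word, 1))

    def reducer():
        # Streaming sorts mapper output by key, so all counts for a word arrive together.
        current, total = None, 0
        for line in sys.stdin:
            word, count = line.rstrip("\n").split("\t", 1)
            if word != current and current is not None:
                print("%s\t%d" % (current, total))
                total = 0
            current = word
            total += int(count)
        if current is not None:
            print("%s\t%d" % (current, total))

    if __name__ == "__main__":
        # Hadoop runs the same file twice: once as the mapper, once as the reducer.
        if sys.argv[1] == "map":
            mapper()
        else:
            reducer()

With Hadoop 1.x the streaming jar normally ships under $HADOOP_HOME/contrib/streaming/, so a run would look roughly like this (input and output paths are placeholders):

    hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-streaming-1.0.3.jar \
        -input /user/you/input -output /user/you/output \
        -mapper "python wordcount.py map" -reducer "python wordcount.py reduce" \
        -file wordcount.py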

Resources

  • Using the Hadoop cluster (InfolabClusterHadoop)