Hadoop and Spark request
To use Hadoop and Spark on Bridges, please complete this form. We will contact you within one business day of receiving your request.
Programming: Python, Java and Scala are all available.
Framework: Request Hadoop, Spark, or both. If you will use Spark, you can optionally use HDFS as well.
Nodes: Bridges supports Hadoop and Spark on its RSM (128GB) nodes. You will have the use of all 28 cores on each node you request, along with all of the local disk on those nodes.
Time: Request the amount of time you need in hours. Note that 28 SUs (1 SU = 1 core-hour) will be deducted from your allocation for each hour requested.
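As a rough illustration of the accounting above, here is a minimal sketch of the SU cost calculation. It assumes the 28 SU/hour charge applies per node (each RSM node has 28 cores and 1 SU = 1 core-hour); the function name and parameters are hypothetical, for illustration only.

```python
def su_cost(nodes: int, hours: int, cores_per_node: int = 28) -> int:
    """Estimate SUs deducted for a Hadoop/Spark reservation.

    Assumes each node is charged at cores_per_node SUs per hour
    (1 SU = 1 core-hour on Bridges RSM nodes).
    """
    return nodes * cores_per_node * hours

# Example: a 2-node cluster reserved for 10 hours
print(su_cost(2, 10))  # 2 nodes x 28 cores x 10 hours = 560 SUs
```

For a single node, the cost reduces to the 28 SUs per hour stated above.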
Specific requests: Be sure to describe any special needs you have, such as a specific Hadoop implementation.
Questions: Contact firstname.lastname@example.org with any questions.