Bristol
Other Resources
- Bristol Physics Department, Particle Physics Research Group: http://www.bristol.ac.uk/physics/research/particle/
Bristol Grid Resources
- 16 PP cores dedicated to the LCG Grid (an increase of 700% over Oct 2005, when I started)
- 81 job slots on the SL4 HPC cluster
- PP WN CPU model: Intel Xeon E5405 (2.0 GHz), Spec2000 = 2100
- HPC WN CPU model: AMD Opteron 2218 (2.6 GHz), Spec2000 = 1745
- SpecFloat: no idea
- RAM: 2 GB/core
- Supported VOs: alice atlas cms dteam gridpp lhcb ops vo.southgrid.ac.uk zeus (see the example configuration after this list)
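For illustration only: a VO list like the one above is typically declared in a site's YAIM site-info.def. A minimal sketch, not Bristol's actual configuration, might read:

 # Supported VOs, matching the list above
 VOS="alice atlas cms dteam gridpp lhcb ops vo.southgrid.ac.uk zeus"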
 
Storage
- 960 TB as HDFS (replication factor 2) with DMLite as a frontend providing xrootd and GridFTP access (see the configuration sketch below)
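The replication factor is an HDFS-side setting. As a minimal sketch assuming a stock Hadoop installation (this is not Bristol's actual configuration file), a site-wide replication factor of 2 is set via the dfs.replication property in hdfs-site.xml:

 <configuration>
   <property>
     <!-- each HDFS block is kept on two datanodes -->
     <name>dfs.replication</name>
     <value>2</value>
   </property>
 </configuration>

With two copies of every block, usable capacity is half of the raw disk capacity behind HDFS.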
 
Monitoring and Accounting
Networking
Answers to the GridPP 'Ten Easy Network Questions': see Bristol (10 Questions)
Ongoing Work and Plans
- New LCG Grid kickstart server: in production, a great improvement
- Work to use the Bristol HPC cluster with a tarball (TAR) WN installation
- Jon Wakelin's StoRM work
- 24-hour transfer tests (see the illustrative example after this list)
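A 24-hour transfer test typically consists of sustained file copies against a site's storage endpoints. Purely as an illustrative sketch, with a placeholder hostname and path rather than Bristol's real endpoint, a single copy via an xrootd door could look like

 xrdcp /tmp/testfile root://se.example.ac.uk:1094//dpm/example.ac.uk/home/dteam/testfile

and the same file could be pushed through a GridFTP door with

 globus-url-copy file:///tmp/testfile gsiftp://se.example.ac.uk/dpm/example.ac.uk/home/dteam/testfile

A 24-hour test then repeats such copies and records the sustained rate.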