lolo is a scalable, file-based storage service for research computing customers at UW. Offered through UW-IT, it was developed in collaboration with the eScience Institute to provide external storage for the Hyak cluster and to address the growing storage needs of researchers campus-wide. Two classes of storage are available:
- Collaboration Filesystem / Data Transfer Node
- Archive Filesystem
lolo is one element in a collection of centrally managed services supporting research computing at UW, including the Hyak scalable cluster computer and a high-speed networking hub connecting lolo, Hyak, campus, and the research Internet, including commercial cloud providers. Together, these allow users to create and operate sophisticated, data-driven workflows for any effort which requires them, including *omics research, nuclear and particle physics, and others.
The collaboration file service is entirely disk-based and is intended for sharing research data among peers on and off campus. It can act as a high-performance data transfer node (throughput of hundreds of megabytes to gigabytes per second, depending on file size, number of files, etc.), as well as a convenient means of accessing Hyak results from workstations in your lab. Data is stored in redundant disk arrays and accessed via a redundant pool of servers. The system is designed to expand easily to accommodate a petabyte or more.
The archive service is intended as a repository for data that you may rarely access but want to keep safe and conveniently accessible over the long term. While it is not intended for interactive or general-purpose file storage, it does ensure your files are always available when you need them: simply open a file as you would on any other filesystem, and within a minute or two the data begins streaming back quickly (retrieval of a 100 GB file proceeds at 125 MB/s, including the initial 1–2 minute delay).
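As a rough sketch of what those figures imply for planning, the end-to-end retrieval time can be estimated from the quoted ~125 MB/s streaming rate and a worst-case two-minute staging delay (the function name and defaults below are ours, for illustration only):

```python
def archive_retrieval_minutes(size_gb, rate_mb_s=125, staging_min=2):
    """Estimate end-to-end archive retrieval time in minutes.

    Assumes the quoted ~125 MB/s streaming rate and a worst-case
    two-minute delay before data begins to stream back from tape.
    """
    streaming_min = (size_gb * 1000) / rate_mb_s / 60
    return staging_min + streaming_min

# A 100 GB file: ~2 min staging + ~13.3 min streaming.
print(round(archive_retrieval_minutes(100), 1))  # → 15.3
```

So a 100 GB recall completes in roughly a quarter of an hour, consistent with the figures above; actual times will vary with file count and system load.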
With the Archive, users write files to disk as they would with any other network filesystem. Within a day all files are automatically transferred from disk to tape in a primary campus datacenter. A second tape copy is created in another campus datacenter within an additional day or two. Recall of files is automatic and accomplished by simply opening and reading the file. Capacity can easily expand to several petabytes.
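Because recall is transparent, no special API or staging command is involved: an ordinary open and read is enough, and the call simply blocks while the file is brought back from tape. A minimal sketch of that access pattern (the helper name is ours, not part of the service):

```python
def read_archived(path, nbytes=4096):
    """Read the first nbytes of a file on the archive filesystem.

    Opening and reading is all that is required: the filesystem
    recalls the file from tape automatically, so on archived data
    this call may block for a minute or two before it returns.
    """
    with open(path, "rb") as f:
        return f.read(nbytes)
```

On a mounted archive path, `read_archived("/path/to/archived/file")` behaves exactly as it would on local disk, just with the initial staging delay.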
Both lolo filesystems are mounted directly on the Hyak login/data-mover nodes via multiple 10 Gb/s connections, ensuring very high transfer speeds. They are also accessible from off campus via a 10 Gb/s circuit that bypasses the campus network firewall, allowing large-scale data transfers exceeding 500 MB/s in some cases. By early 2013, campus users will benefit from expedited access to lolo (and Hyak) via a dedicated research backbone.
The protocol for data access varies, depending on user location:
Capacity is purchased in 1 TB increments for periods of three years. Rates are $1,570/TB/year for the Collaboration filesystem and $190/TB/year for the Archive. Customers are billed monthly. We anticipate these rates dropping substantially over time as capacity grows; rates will be adjusted annually.
A service description for lolo appears in the UW-IT Service Catalog:
For assistance in understanding how lolo can help in your lab or department, please contact help [at] uw [dot] edu.
lolo is a word in Chinook Jargon, meaning “the whole thing” and “to carry”, depending on the application of accents. Written in lower case, lolo also suggests the binary string 1010 (decimal 10).
Chinook Jargon is the trade language of the Pacific Northwest, incorporating terms from Chinook and Chehalis and other local languages, as well as French and English. We’ve chosen words from Chinook Jargon for the names of systems in the UW research cyberinfrastructure to emphasize their role in supporting the broad range of UW research users and our ties to our place between the mountains and ocean.