National Center for Supercomputing Applications
The National Center for Supercomputing Applications has evolved into a scientific research center built around a national services facility. NCSA is developing and implementing a national strategy to create, use, and transfer advanced computing and communication tools and information technologies. These advances serve the center's diverse set of constituencies in the areas of science, engineering, education, and business.
When the center opened to the research community in 1986, researchers using the supercomputers were from the traditional disciplines of physics, chemistry, materials, and astrophysics. But researchers in other disciplines quickly realized they, too, could transform their fields by using supercomputing. Today's growth areas include commercial and financial data mining and networked World Wide Web multimedia asset management.
NCSA works closely with the computer science research community to bring users the most advanced methods in high-performance scalable computing. The center currently maintains these types of high performance systems: HP-Convex Exemplar S-Class, Silicon Graphics CRAY Origin2000, Silicon Graphics POWER CHALLENGEarray, HP-CONVEX Exemplar SPP-1200, and Thinking Machines CM-5.
Scientists and researchers associated with NCSA provide never-ending challenges in all areas of computational research. Their need to solve complex problems, to comprehend billions of numbers, to collaborate, and to access information has led to advances in cyberspace technologies and virtual reality.
Since its beginning, NCSA has developed desktop software to help users navigate cyberspace. The 1993 release of the network browser NCSA Mosaic provided the basis of a technology that is generating the "gold rush" on the Internet. NCSA is now creating a powerful desktop-based collaborative software environment that will help eliminate distance barriers for virtual teams.
- 1 Traversing cyberspace
- 1.1 From Mosaic to Habanero -- collaborative environments in cyberspace
- 1.2 The evolving Internet
- 2 Software
- 3 See Also
In the course of developing hardware and software tools for science, NCSA has created many tools for use in cyberspace, from NCSA Telnet, NCSA Image, and NCSA Collage, to the widely recognized NCSA Mosaic. Released across all platforms in 1994, Mosaic opened the Internet to millions of people. With its hypertext-driven interface, users could, for the first time, easily browse the thousands of documents proliferating on the World Wide Web. A familiar click of the mouse button retrieved text, images, data -- even movie and audio clips -- residing on computers around the world and quickly displayed them on the user's desktop computer.
These features proved so popular that within a year almost a dozen commercial versions of the browser hit the market. Within 18 months of Mosaic's release, traffic on the Web -- the graphics-intensive portion of the Internet -- increased 100,000-fold. Within a year, derivatives of Mosaic were found in more than 80 products. NCSA's Web server software (the companion of Mosaic client software) resides on at least half of all public machines that serve the Web.
PC Magazine declared, "It's clear that Mosaic started a revolution." Mosaic continues to be enhanced, but the center is moving on. Now NCSA is bringing the power of high-performance computing to the Web and reinventing collaborative tools with the power of the Internet behind them. These new developments -- many of which are described below -- may eventually prove as revolutionary as Mosaic.
"Cyberspace is a place where people communicate over unknown distances and where distances don't matter at all." --Gerry Labetz, winner of NCSA's 1995 Industrial Challenge Award in recognition of his new digital cellular telephone system designed for Motorola Corporation. Like the Internet itself, cyberspace is altering our perception of place and distance.
From Mosaic to Habanero -- collaborative environments in cyberspace
Habanero builds on the Web technology of Mosaic. Not a browser, Habanero is a new Java(tm)-based framework that will harness the power of the Web for collaboration. Its developers think of it as the infrastructure of a kind of global research laboratory, or better yet, a virtual institute for scientists. Eventually it will support all the technologies scientists will need for collaboration: real-time data, 3D visualizations, email, bulletin boards, and global annotation.
Habanero's combination of synchronous (at the same time) and asynchronous (at different times) communication capabilities is the key to its versatility. Scientists will be able to activate such real-time synchronous capabilities as the electronic whiteboard (where notes jotted on one screen are similarly jotted on the screens of other session participants). At the same time, they can use asynchronous capabilities for chat sessions or email, to annotate files, or to view text or image files. A more futuristic feature is the Internet agent -- an electronic assistant that scientists can program to sit in on Habanero sessions in their absence. The agent might record the session or beep if talk turns to a topic its owner is interested in. Even a 3D representation of the absent scientist is a possibility.
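The synchronous whiteboard described above can be sketched as a simple event broadcast: an action generated on one participant's screen is rebroadcast to every session member so that all copies stay in step. The class and method names below are illustrative, not Habanero's actual API.

```python
class WhiteboardSession:
    """Toy sketch of synchronous session sharing: one participant's
    event is delivered to every participant, keeping screens in step."""

    def __init__(self):
        self.participants = []  # each participant keeps its own event log

    def join(self, name):
        participant = {"name": name, "events": []}
        self.participants.append(participant)
        return participant

    def broadcast(self, sender, event):
        # Synchronous model: deliver to everyone, sender included,
        # so every copy of the whiteboard shows the same strokes.
        for p in self.participants:
            p["events"].append((sender["name"], event))


session = WhiteboardSession()
alice = session.join("alice")
bob = session.join("bob")
session.broadcast(alice, "draw line (0,0)-(10,10)")
print(bob["events"])  # bob's screen now shows alice's stroke
```

Asynchronous capabilities would differ only in delivery time: the same event log could be read later by a participant (or an agent) who was not present when the event was broadcast.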
Described below are some of the new tools that will be a part of Habanero.
VRML -- Bringing virtual reality to the Web
Many of the colorful graphics that now punctuate the Web are going 3D. A new file format called VRML, for virtual reality modeling language, encodes these large graphic files into a compact format so they can be transported over a network. Once at a scientist's desktop, the images can be rotated, manipulated, or walked through as if they were real 3D spaces. NCSA is developing a publicly available "viewer" that can be downloaded onto any client machine so that anyone with a Web browser can display these images. Eventually, the center will incorporate mobile Java-like code into the language so that sound and other more advanced interactive capabilities can be executed at the desktop. When that happens, the 3D images may even walk and talk. It will be as close to a face-to-face meeting as two scientists can have without being in the same place.
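To make the format concrete, the short function below emits a minimal VRML 1.0 scene containing a single colored cube; a VRML viewer reads this plain text and renders the 3D object. This is a sketch of the file format only, not NCSA's viewer.

```python
def vrml_cube(width=2.0, height=2.0, depth=2.0, color=(1, 0, 0)):
    """Emit a minimal VRML 1.0 scene: one cube with a diffuse color.
    VRML is plain text, so even a large scene is easy to generate
    and compact to ship over a network."""
    r, g, b = color
    return (
        "#VRML V1.0 ascii\n"
        "Separator {\n"
        f"    Material {{ diffuseColor {r} {g} {b} }}\n"
        f"    Cube {{ width {width} height {height} depth {depth} }}\n"
        "}\n"
    )


print(vrml_cube())
```

A viewer on the client machine interprets the `Separator`, `Material`, and `Cube` nodes and lets the user rotate or walk through the resulting scene.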
HyperNews -- bulletin boards but better
Similar to news groups, HyperNews supports conferencing on the World Wide Web. Readers can add responses to existing WWW pages, which are displayed to a set depth, so that the participants can carry on a threaded style of conference. NCSA is developing many new features for HyperNews: new mechanisms for notifying participants of responses (currently they are sent an email); various options for organizing the forum; security features; and annotation capabilities.
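The threaded display with a set depth can be sketched as a recursive walk over a reply tree that stops indenting below the chosen depth. The data layout here is a hypothetical stand-in, not HyperNews's internal representation.

```python
def render_thread(post, depth=0, max_depth=3):
    """Render a threaded discussion as indented lines, truncating
    replies nested deeper than max_depth (the 'set depth')."""
    lines = ["  " * depth + post["title"]]
    if depth < max_depth:
        for reply in post.get("replies", []):
            lines.extend(render_thread(reply, depth + 1, max_depth))
    return lines


thread = {
    "title": "Is HDF right for satellite data?",
    "replies": [
        {"title": "Re: yes, it is self-describing",
         "replies": [{"title": "Re: what about 2 TB/day?"}]},
        {"title": "Re: consider the tools too"},
    ],
}
print("\n".join(render_thread(thread, max_depth=1)))
```

With `max_depth=1` only the top-level responses appear; raising the depth reveals the deeper branches of the conversation.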
Security and privacy on the Internet
Security and privacy features being incorporated into new releases of NCSA Web browsers will help safely transmit information among the estimated 30,000 networks and 2.2 million computers on the Internet. Based on public key encryption technology and protocols such as S-HTTP, the application converts normal text into a scrambled message that can be decrypted only by someone possessing the correct key, or encryption code. Whereas communicating over the Internet now is like sending postcards, security features will offer the virtual equivalent of the sealed envelope. They assure a message's authenticity, integrity, privacy, and nonrepudiability -- all the essentials for commerce and the growth of other vital communications over the Internet. Security-related features will include Message Digest Authentication and Kerberos Authentication.
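The digest idea behind Message Digest Authentication can be illustrated in a few lines: hashing the message together with a shared secret yields a fingerprint that only a holder of the secret can produce or check, so tampering is detectable without sending the secret itself. This is a sketch of the principle, not the full protocol used by the browsers.

```python
import hashlib


def sign(message, shared_secret):
    """MD5 digest over message + secret: a tamper-evident fingerprint.
    (Sketch of the digest-authentication idea, not the full protocol.)"""
    return hashlib.md5((message + shared_secret).encode()).hexdigest()


def verify(message, digest, shared_secret):
    """Recompute the digest and compare; any change to the message
    (or a wrong secret) makes verification fail."""
    return sign(message, shared_secret) == digest


d = sign("transfer $100 to account 42", "s3cret")
print(verify("transfer $100 to account 42", d, "s3cret"))  # True
print(verify("transfer $900 to account 42", d, "s3cret"))  # False
```

Public key encryption goes further by removing the need for a pre-shared secret, but the integrity check works on the same fingerprint principle.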
Hierarchical Data Format (HDF) -- simplifying data sharing for science
When NASA officials were looking for software to handle the estimated 2 terabytes of global climate change data the agency's brigade of satellites would soon be beaming to Earth each day, they chose NCSA's Hierarchical Data Format (HDF). What swayed them was its power, flexibility, and the broad range of NCSA supporting software, such as its Web software and data visualization tools.
The HDF software library automatically converts data from almost any computer system into a self-describing, machine-independent form that can be read by programs running on other machines. HDF's capacity for self-description is particularly important since this feature, which is common in commercial software, is difficult to include and hence is unavailable for most of the highly specialized programs written by scientists. Self-description places crucial descriptive information about a file within the file. Without it, sharing files is like trying to develop a roll of film without knowing its speed, size, or even whether it is black and white or color.
HDF encodes most of the data structures common to science -- raster images, multidimensional gridded data, finite-element data, multivariate datasets, polygonal mesh data, graphs, color palettes, and text. These data objects can be saved singly or combined in one HDF file, which later can be easily annotated and expanded by adding other objects, then easily brought into collaborative sessions.
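The self-description idea can be sketched as follows: the file carries its own header describing the payload, so a reader needs no outside knowledge to interpret the bytes. The toy format below is for illustration only; real HDF files use a richer, standardized layout.

```python
import json
import os
import struct
import tempfile


def write_self_describing(path, label, values):
    """Write a file whose JSON header describes the payload that follows:
    [4-byte header length][JSON header][packed doubles]."""
    header = json.dumps(
        {"label": label, "count": len(values), "type": "float64"}
    ).encode()
    with open(path, "wb") as f:
        f.write(struct.pack("<I", len(header)))
        f.write(header)
        f.write(struct.pack(f"<{len(values)}d", *values))


def read_self_describing(path):
    """Read the file using only the information stored inside it."""
    with open(path, "rb") as f:
        (hlen,) = struct.unpack("<I", f.read(4))
        header = json.loads(f.read(hlen))
        values = struct.unpack(f"<{header['count']}d",
                               f.read(8 * header["count"]))
    return header, list(values)


path = os.path.join(tempfile.mkdtemp(), "demo.dat")
write_self_describing(path, "temperature (K)", [273.15, 300.0])
header, values = read_self_describing(path)
print(header["label"], values)  # the file explained itself
```

Because the description travels inside the file, a program on a different machine can read it back correctly with no accompanying documentation -- the property that made HDF attractive for sharing satellite data.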
The evolving Internet
World Wide Web servers to repositories
Until recently designers of the Internet focused on client and server protocols -- the software rules and regulations that have made it possible for you to sit at your computer (the client) and click to video clips, images, and texts residing on computers elsewhere (the servers). Now, however, the emphasis is on getting servers talking to servers. By enabling these machines to communicate between each other, you can move beyond fetching and retrieving information on the Internet toward actually managing the information space. Several initiatives at NCSA hint at how these new layers of communication will reshape the Internet and how we can use it.
Designing the scalable Web server
Web servers at NCSA are busy machines. Every week they receive 4 million requests for files; that's almost 24,000 "hits" every hour. During peak hours, the number of requests climbs to 60,000, or more than 30 every second -- for hours on end. No computer was designed to take that many hits on a sustained basis. To circumvent server overload, researchers at NCSA have designed a scalable server consisting of several independent servers linked with a distributed file system so that requests can be doled out to the different machines in round-robin fashion. The design is not revolutionary, but its innovative configuration is being implemented widely by other organizations that are trying to keep pace with the Web. One well-known imitator is the White House.
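The round-robin scheme amounts to cycling through the pool of identical servers so that load spreads evenly. The dispatcher below is a minimal sketch of that idea; the server names are illustrative, and a distributed file system (not shown) is what lets any machine answer any request.

```python
import itertools


class RoundRobinDispatcher:
    """Dole incoming requests out to a pool of identical servers in
    strict rotation, as in NCSA's scalable-server configuration."""

    def __init__(self, servers):
        self._cycle = itertools.cycle(servers)

    def route(self, request):
        # The request content is irrelevant: every server sees the same
        # files via the shared file system, so rotation alone suffices.
        return next(self._cycle)


pool = RoundRobinDispatcher(["www1", "www2", "www3"])
for i in range(4):
    print(pool.route(f"GET /doc{i}.html"))  # www1, www2, www3, www1 ...
```

Because no single machine absorbs the full request stream, the pool can be grown simply by adding servers to the rotation.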
The minutes now required to download a 30-second video clip are plodding in a technology defined by speed. That's why researchers at NCSA are designing a new kind of video server that will speed up transmission as well as accommodate whole new video features, like hyperlinks and searchable databases. With this next-generation server, videos won't be downloaded in their entirety by a browser that then plays them back. Instead, the video data will be "streamed" so that viewing can begin immediately.
For the system to work, the browsers will require enhancement, but the most significant developments will take place in the server. The researchers now are confronting hierarchical storage problems similar to those with mass storage. Since video consumes large amounts of storage -- and storage space that can be accessed quickly is expensive -- the video data has to automatically divide itself among fast storage areas and bulk storage while still providing the browsers with video quality transmissions. Still to be figured out is how to index and store the terabytes of data the videos will consume. The Digital Library Initiative described below may yield some ideas.
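The contrast between downloading and streaming can be sketched with a generator that yields the video in small chunks: the first chunk is available for playback before the rest has arrived. This illustrates the streaming idea only, not NCSA's server design.

```python
def stream_video(frames, chunk_size=2):
    """Yield video data a chunk at a time, so a player can begin
    showing the first chunk before the whole clip has arrived --
    unlike a download, which hands over nothing until it finishes."""
    for i in range(0, len(frames), chunk_size):
        yield frames[i:i + chunk_size]


clip = ["frame0", "frame1", "frame2", "frame3", "frame4"]
player = stream_video(clip, chunk_size=2)
print(next(player))  # playback starts with ['frame0', 'frame1']
```

The hierarchical-storage problem described above sits behind this interface: the server must pull chunks from fast or bulk storage quickly enough that the stream never stalls.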
Digital Libraries Initiative -- connecting information and computers
A complaint often heard about the Internet is the difficulty in finding information. You can while away hours hopping from weather pages to the CIA and on to MGM studios before you stumble upon the study of Monarch butterflies you set out to find. Today searching is tedious but tomorrow it will be automatic. One reason is the advanced searching and linking capabilities being developed as part of the Digital Libraries Initiative (DLI). DLI is an ambitious project funded by the National Science Foundation, the Advanced Research Projects Agency, and NASA to organize the Internet. Projects at six universities will create the advanced protocols for searching, indexing, and publishing that will do for electronic information space what the Dewey Decimal System did for libraries.
Researchers at NCSA are working with the University of Illinois Library on an enhanced Web searching mechanism for scientific and technical journals. Their testbed will contain 100,000 journal articles, complete with pictures, graphics, and citations. It will gather data across the Big Ten universities for four years.
Social scientists and librarians are applying their understanding of human behavior to the design of the searching mechanism so it will be compatible with the ways in which people narrow their inquiry. When completed, the six projects will be linked through a server-to-server framework so that, for instance, a search initiated on one server will trigger searches on other servers; or a document published on one server will be registered and indexed to other documents with the push of a publish button.
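The server-to-server search described above is a fan-out pattern: a query received by one server is forwarded to its peers and the results are merged and deduplicated. The callables below stand in for real protocol calls between servers.

```python
def federated_search(query, peer_search_functions):
    """Fan a query out to peer servers and merge their results.
    Each peer is modeled as a callable taking the query and
    returning a list of matching document identifiers."""
    results = []
    for search in peer_search_functions:
        results.extend(search(query))
    # Merge, dropping duplicates reported by more than one peer.
    return sorted(set(results))


# Two hypothetical library servers, each indexing its own collection:
library_a = lambda q: [f"{q}:article-17", f"{q}:article-42"]
library_b = lambda q: [f"{q}:article-42", f"{q}:thesis-03"]

print(federated_search("monarch-butterfly", [library_a, library_b]))
```

A publish event would work the same way in reverse: one server notifies its peers so the new document is registered and indexed everywhere at once.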