
Photo of a projected image seemingly passing through the shadow of a man standing in front of it

The Metaverse:
Projecting Us Into the Future

A forest fire rages and UK mechanical engineer Jim McDonough is right in the thick of it.

But his skin doesn't feel the flames and his lungs aren't choked by the smoke—this is a "virtual" fire. He's surrounded by a life-size image of the blaze, projected on the walls of an ordinary room. As he is "immersed" by the fire, he observes how it moves and discovers how to control it.

While this may seem a bit like science fiction, and in fact that's where the concept came from, this new "virtual reality" is science fact. First described in Neal Stephenson's 1992 sci-fi novel Snow Crash, the Metaverse is a digital world that mimics reality. At UK, the term captures the sense and spirit of innovative projects in networking, multimedia, and computer vision under way in the Laboratory for Advanced Networking (LAN) in the James F. Hardymon Building.

"The collaborations going on here among computer science and electrical engineering faculty and students are best described by the term 'Metaverse,' which is the over-arching theme that ties all of our research interests together," says Jim Griffioen, LAN director.

Photo: Graphics cards, which help the computer display graphics, consist primarily of a special processor called the accelerator, which handles display functions, and RAM (memory) that stores display information.

To faculty tackling research in areas such as network protocols, distributed computing systems, and visualization, the meaning of the Metaverse can be boiled down to two words: networked multimedia. "People enter these virtual worlds with pixels [image building blocks] painted all around them, and we need to provide the communication infrastructure so you can effectively interact with other people in similar environments, just as if they were sitting right across from you," says Griffioen.

With applications ranging from teleconferencing and scientific modeling to virtual art exhibits, digital library collections, and personal computing, the Metaverse will do nothing less than change the way we see, play, learn, and work. One day each of us may enter the Metaverse as easily and as often as we now tap into the Internet.

Pixel Immersion

Application-level research at UK focused on this pixel-painted world (a.k.a. immersive display environment) is spearheaded by Brent Seales and Christopher Jaynes, and funded by the National Science Foundation. "I think of the Metaverse as creating a world for our senses that's pretty seamless in terms of integrating real video, computer models, and your reaction to the environment in a very natural way," says Seales, an associate professor of computer science. "The Metaverse is an interface to interact with the way we represent other people in video, as well as with 3-D models."

The first and still prevalent paradigm for such visualization, called a CAVE (Cave Automatic Virtual Environment), surrounds the user with rear-projection TVs. "Essentially you walk into a room that consists of four walls that are just gigantic back-projection TVs hooked to a computer," Griffioen explains. "This technology is very useful for scientific visualization. It allows users to get into their data and look around." But the big drawback is money. "Back-projection is incredibly expensive and immobile. You have to do a lot of renovation to your space," Griffioen says. "And more importantly, CAVEs are hard to maintain. If these TVs get out of sync or stop working, it's a massive undertaking to get them fixed."

Illustration: A computer drawing of an immersive display room, where all four walls are covered with projected images.

The solution: a new paradigm featuring low-cost, self-configuring visualization environments that are networked together and can be set up, practically on the fly, at any location. Seales and Jaynes describe several types of Metaverse "portals": a room where all four walls are covered with projected images, a classroom where the front wall is a gigantic projected display, or a trade-show kiosk where images are projected on temporary screens. But all of these options rely on the same relatively inexpensive technology: video projectors, graphics and network cards, and off-the-shelf PCs.

Illustration: A classroom utilizing projected displays.

"Three years ago this projector cost $8,000," says Seales, pointing to the projector mounted above the desk in his office. "Right now you can get a high-end projector with the same capabilities for $3,500. You can get a good projector for $2,000, and the price just keeps dropping."

Illustration: A trade-show kiosk.

And thanks to game technology, the graphics cards needed to run the Metaverse environment cost only $200. "You can think about the graphics card as actually being a computer in the form of a card," says Seales. "We've harnessed the capabilities of those cards to do the computations to make the display look seamless. Because the market is driven by all those teenagers buying games, we can now buy the latest and greatest card for $200 and do stuff with it that five years ago you couldn't do without a complete workstation. So with a PC (plus graphics card and network card) per projector, network communication, and some clever algorithms, you can launch the Metaverse."

But Seales and Jaynes didn't stop there. They added a camera that tracks the user. "The idea is that we'll have a virtual interface in the Metaverse. There won't be a keyboard or mouse anymore. I will convert my office into an 'intelligent' display where cameras will be watching what I'm doing. Say, today's newspaper is being projected on the surface of my desk. If I make the motion of turning a page, I should see the image change, just as if I were turning the page of an actual paper," says Jaynes, a computer science assistant professor. He labels this work "augmented reality" and while it may sound a little "out there," it would mean direct interaction with the Metaverse environment. For example, a professor using a projected display for a lecture could advance his PowerPoint presentation by "pushing" a virtual button.
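The gesture-driven interaction Jaynes describes can be approximated with very simple computer vision: watch the patch of the display that acts as a "button" and trigger an action when enough pixels change there between frames. The sketch below (Python with OpenCV) only illustrates that idea and is not the lab's actual tracking code; the region coordinates and thresholds are invented.

```python
import cv2

# Hypothetical screen-space rectangle acting as a projected "button": (x, y, w, h).
BUTTON_ROI = (500, 400, 120, 80)
PRESS_FRACTION = 0.3   # fraction of changed pixels that counts as a "press"

def crop(frame, rect):
    x, y, w, h = rect
    return frame[y:y + h, x:x + w]

cap = cv2.VideoCapture(0)                    # camera watching the projected display
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Pixels that changed since the last frame, restricted to the button region.
    diff = cv2.absdiff(crop(gray, BUTTON_ROI), crop(prev, BUTTON_ROI))
    _, changed = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

    if cv2.countNonZero(changed) / changed.size > PRESS_FRACTION:
        print("virtual button pressed -> advance the slide")   # stand-in for a real action

    prev = gray
```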

Seales and Jaynes are testing these concepts in the CoRE (Collaborative Rendering Environment) lab in the Hardymon Building. This immersive display lab is currently stocked with six high-resolution projectors, six PCs with graphics cards, two infrared tracking devices, a network hub and six network cards, four cameras, and a wireless keyboard and mouse.

"Our goal in the CoRE lab is to have a projector covering every surface," Jaynes says. "We plan to hang at least 15 projectors from the ceiling."

More projectors mean higher resolutions. "Right now we can get about 10 mega-pixels up on the wall for you, which is about twice as many as a CAVE can do. But there's no reason, in this framework, that we can't scale that up to 100 or 1,000 mega-pixels, which equals a giga-pixel," says Seales.

The question of how you can blend the output of many projectors into one image is where Seales's and Jaynes's training in computer vision (understanding how a computer processes visual information) comes in. Seales earned his master's and Ph.D. from the University of Wisconsin-Madison, and wrote his thesis on machine vision. During a leave of absence from UK, he spent the 1991-92 academic year as a postdoctoral scholar at INRIA, the French National Institute for Research in Computer Science and Control.

Jaynes's interest in computer vision started in high school at a job with Hewlett-Packard. "I worked with a robot they were trying to use to deliver mail automatically," he says. Jaynes went on to undergraduate study at the University of Utah, a school well known in the computer vision and graphics field, and to graduate school at the University of Massachusetts.

Photo: Brent Seales (left) and Christopher Jaynes are leading research into the applications of front-projection technology for immersive viewing experiences.

"I have to say that when I came here, UK was not really poised to be a player in what I was interested in doing," says Seales. "But through a number of grants, and initiatives from UK and outside the university, like the Research Challenge Trust Fund, UK is now a player."

"When I graduated from UMass in late 1998, I wanted to go to a school with a lot of resources, but I wasn't interested in going somewhere that was already famous for computer vision," Jaynes says. "I wanted to go to an up-and-coming school where I could set my own research agenda." The problem with going to a well-known program, says Jaynes, is that young faculty often end up working under a "star" as a second-tier researcher, sometimes forever.

"I wanted to do exactly what we're doing now—define a difficult problem, like the Metaverse, something a little bit 'out there,' and work together to come up with solutions so we can actually reach something exciting."

The Fundamentals and the Future

At any given time, Seales and Jaynes have a team of 20 to 30 Ph.D., master's and undergraduate students working on Metaverse projects. In fact, Steve Webb, now a Ph.D. student and one of the first undergrads Jaynes recruited, was a key player, Jaynes says, in the Metaverse shadow-removal project.

Jaynes came to UK with the promise of prime real estate in the new Hardymon Building, but with completion still a few years away and no space to start the projection research he was interested in, he enlisted the help of Joe Fink, director of ASTeCC, the Advanced Science and Technology Commercialization Center. Fink arranged for Jaynes to use a third-floor office in the ASTeCC building.

"So Steve and I mount two projectors on the wall of this little office. We sit down, and the shadow-removal project was born because the room is so tiny that no matter where he sits Steve's head is blocking the projector," Jaynes recalls. "And Steve says, 'We should know where my shadow is on the wall and correct for it with the second projector.' It was a great idea.

"By providing us space, Joe helped kick-start this research," Jaynes says. "Without his help we wouldn't have had displays ready for opening day of the Hardymon Building last September."

The idea behind the project was to exploit projector overlap. "When you have double or triple illumination (every point illuminated by multiple projectors), you can remove a shadow because the camera detects when a projector is blocked and the computer makes the other projectors kick in more light," says Seales. The computer creates a mask, which defines the area of the image that is being obscured, and tells the other projectors to blast in those missing pixels to complete the image.
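A minimal sketch of that idea, assuming two overlapping projectors and a camera image already registered to the display: compare what the camera sees with what should be on the wall, treat large deficits as the shadow mask, and have the unblocked projector boost those pixels. The function names and threshold below are illustrative, not the published algorithm.

```python
import numpy as np

def shadow_mask(expected, observed, threshold=40):
    """Boolean mask of pixels where the wall is much darker than it should be.

    expected, observed: 8-bit grayscale images in the same (display) coordinates.
    """
    deficit = expected.astype(np.int16) - observed.astype(np.int16)
    return deficit > threshold

def compensate(buffer_a, buffer_b, mask):
    """Where projector A is occluded, ask projector B to supply the missing light."""
    out_a = buffer_a.copy()
    out_b = buffer_b.copy()
    out_a[mask] = 0                                            # blocked pixels are wasted light
    boosted = np.minimum(out_b[mask].astype(np.int16) * 2, 255)
    out_b[mask] = boosted.astype(np.uint8)
    return out_a, out_b
```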

The technical paper on shadow removal, authored by Jaynes, Webb, Seales, and Michael Brown, one of Seales's Ph.D. students, will be published as part of the IEEE (Institute of Electrical and Electronics Engineers) Visualization 2001 conference in San Diego this fall.

"My research philosophy has always been: approach things theoretically, understand the mathematics, and solve the problem. But then there's always an extra step—figure out how you can make it work in hardware for low cost," Jaynes says. "Our shadow-removal tool runs in hardware at a rate of 30 frames per second—you can wave your arm and it automatically corrects. And it runs on hardware that anyone can afford; in fact, your home PC will probably run it."

Provided you are running Linux. An alternative to Windows and the Mac OS, created by University of Helsinki student Linus Torvalds in 1991, Linux is an operating system popular among computer scientists because it is "open source." "That means I can go into the operating system, look at the source code, and change it to make it behave how I need it to behave," Jaynes says. "And when you're talking about research, no one is doing the same things we are, so we have to be able to tweak the operating system."

Photo sequence: This sequence of photos of an aquarium simulation (with swimming whales and sharks) illustrates the auto-calibration program developed at UK.

A) Three projectors are combined to produce a single display.

B) Michael Brown changes the position of the middle projector.

C) Because the projector has been tilted, there are now "ghosts"—you see duplicate fish as they cross into the area projected by the middle projector.

D) The camera watching the display recognizes a projector has moved, and the computer begins a recalibration sequence. The sequence begins with a series of dots that define the area of the image each projector is contributing, and ends with this grid pattern that blends the three projections together.

E) The new, fixed image shows the fish swimming across the seamless display.

Serious Play

Writing the algorithms to run the Metaverse involves some serious math smarts. "Computer vision's a total mixed bag. There are very theoretical components, there's deep mathematics, and you have to be able to do those two things plus write really good code to make the stuff work in the end. If someone comes to me and they know a little bit about linear algebra, I say they can usually pick up the projective geometry stuff. People say my lab is too hard and some students avoid me," Jaynes says, smiling. "But I have outstanding students right now.

"Our students recently set up a four-projector display, 11 by 11 feet, and they're running Quake [a popular arena combat game] on top of our software that blends it all together into one huge image," Jaynes says. "What's good about that is it proves our system actually works with out-of-the-box software."

Michael Brown implemented the Quake system. "This combination of self-configuration and OpenGL [a graphics language] portability is one of the major themes in Michael's thesis work," says Seales. Brown earned his Ph.D. from UK in July and accepted an assistant professorship at the Hong Kong University of Science and Technology.

The "synchronized rendering algorithm"—the mathematical magic behind transforming the output of several projectors into a seamless image—works in a pretty simple way, as Jaynes explains.

"Each projector says, 'It's my turn to draw the picture. I'm ready to draw it.' They send that message over the network, and when every projector is ready, the server says, 'Draw.' Just last week my students got about 10 e-mails asking to download our algorithm for Linux. These are probably just kids that want to do the Quake thing, but word is starting to leak out."

Jaynes is pursuing intellectual property protection, which may ultimately lead to patents, for a number of ideas, including shadow removal, auto-keystoning, and auto-calibration.

If you tilt a projector up, the sides of the image are no longer parallel and form a shape that resembles a keystone in an arch. Matt Steel, a Ph.D. student working with Jaynes, is writing algorithms to automatically fix this problem, even as you reposition the projector. "The university could benefit by patenting these ideas for use in the home-theater market," Jaynes says. "I think eventually we're going to have a projector company in here wanting to make a new product, a projector with a built-in camera. You set it on a table, hit a button, and it starts to monitor things like shadows and changes in lighting."
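Both auto-keystoning and auto-calibration come down to estimating the 3x3 homography between what a projector emits and where that light lands, then pre-warping each image with the inverse so the result on the wall looks rectangular. Below is a minimal sketch of that correction step using OpenCV; it assumes the camera has already located four projected reference points and that camera and projector coordinates have been brought into a common frame, and the point values are placeholders.

```python
import cv2
import numpy as np

# Corners of the image as the projector emits them (its own 1024x768 pixel grid).
projector_pts = np.float32([[0, 0], [1023, 0], [1023, 767], [0, 767]])

# Where those corners were observed on the wall after tilting (placeholder values).
wall_pts = np.float32([[80, 40], [990, 90], [950, 730], [60, 700]])

# Homography mapping projector pixels to wall positions.
H, _ = cv2.findHomography(projector_pts, wall_pts)

def pre_warp(image):
    """Warp the image with the inverse mapping so the projected result is undistorted."""
    return cv2.warpPerspective(image, np.linalg.inv(H), (1024, 768))

slide = cv2.imread("slide.png")        # whatever is about to be displayed (example file)
corrected = pre_warp(slide)            # send this buffer to the projector instead
```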

As projector size and price decrease, Seales says he envisions projectors as next-generation track lighting. "I think people are going to pepper rooms with projectors just like normal lighting, only it will be lighting that's completely controllable by computer," he says. But before that happens, this technology will most likely eliminate the traditional desktop monitor. Seales and Jaynes are already using projectors instead of monitors in their offices. They can project their "screen" on any wall at much higher resolutions than most monitors, freeing up desktop space, and with a wireless keyboard and mouse, they're not constrained by cords.

On a larger scale, Seales envisions libraries setting aside a room where users can view digital collections through the Metaverse. "Imagine if you went to an art museum and you had to look at each piece of art through a two-inch hole," Seales says. "Essentially that's what the digital library is now—what you see is limited by the size and resolution of the monitor. We want each library to have an immersive archive room where you can see the artifact at high-resolution. These rooms would be networked so that you'll be able to see collections from other libraries at the same time as you look at local collections."

Photo: Brent Seales and his research team are using digital techniques to preserve a set of fire-damaged, 11th-century manuscripts.

One of Seales's current projects involves digitally capturing a set of manuscripts written by an 11th-century monk on the lives of the saints, part of the British Library's Cottonian Collection, which was damaged by fire in the 18th century. (This work is part of the Digital Atheneum project at UK and is supported by NSF.) "Dr. Kevin Kiernan [UK English professor] provides the scholarship for the humanities side of the project, while my team and I provide the technical innovation," Seales says. "Because of the damage, these documents are inaccessible to scholars, unless they travel to the British Library. Once we work on ways to digitize the manuscripts, the Metaverse would allow scholars all over the world to explore these works."

In a similar way, Seales and Jaynes see the art world embracing this technology. Imagine accessing a collection at the New York Metropolitan Museum of Art and viewing it on your wall at home. Doreen Maloney, an assistant professor in the UK College of Fine Arts, is interested in using the Metaverse to make art more accessible. "Doreen and I have been meeting to devise a strategy to have the Metaverse CoRE Lab as a stop on next year's Gallery Hop," Jaynes says. "We'll have hors d'oeuvres and people will be able to view her artwork on the walls. This would be really good for UK. It's a high-profile event, and it would bring the technology to the forefront."

Jim McDonough, who studies problems like airflow in coal mines and how forest fires spread, collaborated with Griffioen on a proposal to NSF based on the potential for the Metaverse in his research and teaching.

"Most of us who do computational fluid dynamics have recognized for years that as computing power permits us to solve increasingly more difficult problems, we'll need better ways to analyze and understand computed results, simply because we'll be getting more data from a simulation," says McDonough. "Just as we have moved from leafing through piles of computer paper output in the 1980s to flat-screen scientific visualization in the 1990s, we are confident that we must now move to fully immersive, interactive, 3-D methods to be able to understand what the results of a simulation are telling us. The Metaverse will provide such capability at a reasonable cost."

McDonough is also planning to use the Metaverse to teach an undergraduate course in basic fluid mechanics. "This is typically one of the more difficult areas for undergrads to comprehend, and most of us who teach it feel it would be valuable to provide laboratory demonstrations during lectures, possibly as part of every lecture. But this isn't feasible; it's too expensive, too time consuming, and in some cases, even dangerous. We believe the Metaverse will provide us with a tool that will allow easy, inexpensive display of the same things that a laboratory experiment would give us." As part of the NSF proposal, Joan Mazur, UK College of Education, and her students will evaluate the effectiveness of teaching with the Metaverse.

The next step for Seales and Jaynes will be installing Metaverse portals around campus, including one in the new mechanical engineering building, where McDonough will teach, and the William T. Young Library. "I believe that five years from now we'll have people teaching classes in front of a big immersive display. It will use gesture recognition, so they can hit a virtual button on the screen to advance their PowerPoint, and that lecture will be broadcast through the Metaverse to our CoRE lab and to the library so others can participate. If, by the end of five years, people are requesting time to come in and use these immersive displays, I'll feel like we've been successful," Jaynes says.

UK recently received three significant grants from the National Science Foundation to support the next generation of the Metaverse.

$1.27 million grant to fund infrastructure development and research into techniques to support three physically separate, networked visualization laboratories and two new technical staff

$1 million grant partnering UK, the University of Puerto Rico, and the Museo de Arte de Puerto Rico to fund research into the acquisition, representation, and display of digital collections

$375,000 grant to fund further research into configurable display systems for digital libraries and museums

Networking and New Faces

While UK is not the only university exploring immersive displays, the unique strength of the Metaverse effort in the UK LAN is that it brings together researchers working at all levels of networking, from the multimedia applications that use the network to link projectors, cameras, and computers, to the network protocols used to carry the data.

"Our whole paradigm depends on at least a local-area network to make it work," says Seales. "If you want to go to the next level, where you're sending information back and forth to other Metaverse sites, you're looking at a wide-area network. I understand the data and algorithmic requirements, but the transport is beyond me, so it's absolutely critical that we're collaborating with network experts."

Photo: Jim Griffioen is the director of the UK Laboratory for Advanced Networking.

Griffioen credits unique collaboration between LAN researchers on the second floor of the Hardymon building and the UK network administrators on the first floor as vital to the future of the Metaverse. "Having the people who manage real networks so close by has been very beneficial," Griffioen says. "They are able to provide insight into problems we might encounter as we deploy these futuristic environments."

"Working with Doyle Friskney (associate vice president of information systems) and others in charge of UK's networking infrastructure gives us unique access to the campus network as an experimental network," Jaynes says, and points out that this kind of collaboration was one of the reasons he was drawn to UK. "I was looking for an environment that allows me to incubate ideas in the presence of other people, and the networking people here are very valuable to me in that step." And, like Jaynes, many of those networking gurus were recruited to UK through the Research Challenge Trust Fund (RCTF).

The initial phase of RCTF, a program established by the 1997 Kentucky Postsecondary Education Act, included $16 million to support new faculty, graduate students and staff. As one of the 11 programs that UK targeted for Research Challenge Trust Fund money, Computer Science and Electrical Engineering hired nine new faculty, including LAN members Ken Calvert, Christopher Jaynes and D. Manivannan, and affiliate members Alexander Dekhtyar, Hank Dietz, Zongming Fei, and Jane Hayes.

Pumping Up the Network

"The Metaverse introduces some very difficult networking problems," says Griffioen, a UK faculty member since 1991. "The amount of visual information generated by the Metaverse is massive, and even today's high-speed networks are largely incapable of delivering this type of data back and forth."

Griffioen earned his Ph.D. from Purdue University and spent time at AT&T Research Labs in New Jersey studying problems involved in delivering TV across the Internet. His focus is network protocol design (a protocol is the "language" a computer uses to communicate with other computers over a network). Griffioen works closely with Ken Calvert, who was recruited in 1998 from the Georgia Institute of Technology in Atlanta, and together they have grants from the Defense Advanced Research Projects Agency (DARPA), NSF, Intel, and Cisco. Calvert's expertise is in "active networking," a concept he sums up with the phrase "putting the 'work' in network."

He goes on to explain that the Internet relies on routers, the baggage handlers or mailmen of the network. Routers process each packet (each chunk of information) by looking at the "zip code" and sending the packet out the appropriate line to its destination.
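In spirit, a conventional (non-active) router does nothing per packet beyond a destination lookup and a forward. The toy sketch below illustrates that; the table entries and interface names are invented.

```python
# Toy forwarding table: destination prefix -> outgoing line (interface).
FORWARDING_TABLE = {
    "10.1": "eth0",
    "10.2": "eth1",
    "192.168": "eth2",
}

def route(packet):
    """Look at the packet's 'zip code' and pick the line to send it out on."""
    dest = packet["dest"]                                   # e.g. "10.2.4.7"
    matches = [p for p in FORWARDING_TABLE if dest.startswith(p + ".")]
    if not matches:
        return "default"
    return FORWARDING_TABLE[max(matches, key=len)]          # longest matching prefix wins

print(route({"dest": "10.2.4.7", "payload": b"..."}))       # -> eth1
```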

"I tell my students the reason we build networks is to share resources, but there's a whole infrastructure that exists simply to schlep bits from here to there," says Calvert, an associate professor of computer science who earned his master's at Stanford and his Ph.D. at the University of Texas-Austin.

"The idea of active networking is to strategically take advantage of the ever-decreasing cost of computers by putting additional capabilities in the network so it can do more than just move your bits." Calvert is among a growing group of engineers looking for capabilities to add to networks that will give, as he describes,"the biggest bang for the buck, while preserving the characteristics that made the Internet a success—simplicity and robustness, for example."

In 1997 Calvert and his Georgia Tech colleagues set out to write a game plan for active networking, backed by funds from DARPA, the group that wrote the early Internet protocols. What they came up with was an "architectural" document that spells out the different components of active networking. "When we started, a lot of people were going in different directions, using different vocabularies and assumptions," Calvert explains. "Our document defined a set of shared assumptions and defined the pieces of the system we were trying to build so people could point to it and say, 'That's the piece I'm working on.'"

The key components of an active network are the Node-OS (Node Operating System), software that controls access to the node's hardware and transmission channels, and Execution Environments (EEs), which interpret packets.
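A hedged sketch of the distinction: a packet names a small program, and the Execution Environment on a node runs that program rather than blindly forwarding the bits. The handler names and packet layout below are invented for illustration; they are not the interfaces defined in the DARPA architecture document.

```python
class Node:
    """Stand-in for the Node-OS view of one network node."""
    def __init__(self):
        self.seen = set()
    def send(self, dest, payload):
        print(f"forwarding {len(payload)} bytes toward {dest}")

HANDLERS = {}

def handler(name):
    """Register a tiny program the Execution Environment is willing to run."""
    def register(fn):
        HANDLERS[name] = fn
        return fn
    return register

@handler("forward")
def plain_forward(packet, node):
    node.send(packet["dest"], packet["payload"])            # behave like an ordinary router

@handler("suppress_duplicates")
def suppress_duplicates(packet, node):
    # Example of extra "work" in the network: drop packets this node has already seen.
    if packet["id"] in node.seen:
        return
    node.seen.add(packet["id"])
    node.send(packet["dest"], packet["payload"])

def execution_environment(packet, node):
    """Interpret the packet: run the program it asks for, defaulting to plain forwarding."""
    HANDLERS.get(packet.get("program", "forward"), plain_forward)(packet, node)

execution_environment(
    {"id": 1, "program": "suppress_duplicates", "dest": "10.2.4.7", "payload": b"data"},
    Node())
```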

Photo: Ken Calvert (left) and Zongming Fei are involved in networking research to optimize the collaborative potential of the Metaverse.

One of the research projects Griffioen and Calvert are tackling deals with injecting code into the network that merges video streams together, thereby reducing the total bandwidth required. "In this experiment we're sending multiple video streams towards a receiver. As they encounter bottlenecks in the network, where they can't all squeeze through, we've put code inside the network that, on the fly, takes multiple streams and transcodes them into a single stream by discarding the least important data," Griffioen says. "This produces the best-quality video given the constraints."
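A hedged sketch of that in-network merge: when more frames arrive than the outgoing link can carry, keep the frames marked most important and drop the rest, emitting one combined stream. The priority field and byte budget are assumptions for illustration, not the actual transcoder.

```python
def transcode(streams, byte_budget):
    """Merge several frame streams into one that fits through a bottleneck link.

    streams:     list of frame lists; each frame is a dict with 'priority'
                 (higher = more important) and 'size' in bytes.
    byte_budget: how many bytes the outgoing link can carry for this batch.
    """
    frames = [frame for stream in streams for frame in stream]
    frames.sort(key=lambda f: f["priority"], reverse=True)

    merged, used = [], 0
    for frame in frames:
        if used + frame["size"] <= byte_budget:
            merged.append(frame)
            used += frame["size"]
        # Frames that don't fit are discarded: the "least important data."
    return merged

# Two incoming streams squeezed into a 3000-byte slot.
stream_a = [{"priority": 9, "size": 1500}, {"priority": 2, "size": 1500}]
stream_b = [{"priority": 7, "size": 1500}, {"priority": 1, "size": 1500}]
print(len(transcode([stream_a, stream_b], 3000)))          # -> 2 frames survive
```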

The performance of the Internet and the Metaverse is Zongming Fei's primary research concern. Fei, who recently received his Ph.D. from Georgia Tech, came to UK as an assistant professor in 2000. His work centers on the problem of server location. "When you go to a Web page, say cisco.com, there's not just one machine out there that has all of those pages. That site's actually on machines scattered all over the place, and the network figures out which one you should connect to," Griffioen says. Fei is looking at ways to determine which server will have the best response time, and is even involved in strategies to help companies that are ready to install servers figure out where they'll get the best bandwidth.

"The ultimate goal of this research is shorter response times for clients. From the server perspective, if the load on servers is more evenly distributed, the service time will be reduced and everybody can work faster," Fei says. "I am designing a framework, or protocol, that clients can use to specify the requirements and a mechanism that can automatically locate the server with the best performance."

This work relates directly to the Metaverse, Fei says. Multiple Metaverse sites will need multiple servers, and each site will need to locate a server that can provide the best performance for seamless graphics and real-time interaction.

How long will it take for the Metaverse to become pervasive?

Griffioen doesn't have a crystal ball, but he says that once projectors fall into the price range of desktop monitors, we'll probably see large-scale use in personal computing. "Once you've worked in an immersive environment, you immediately start to see the benefits," he says. "When students come into your office to discuss their project with you, it's one thing for them to look over your shoulder at the monitor. It's a whole other thing for it to be as big as life on the wall where you can sit down and share the viewing experience."

As far as the network speed needed to run the Metaverse, Griffioen says it's coming. "Ten-gigabit Ethernet is right around the corner, so within just a few years' time we'll have much higher bandwidth available, and that will facilitate the type of communication we'll need for the Metaverse. With prices of projectors dropping and network bandwidth increasing, the future is within reach."

Alicia P. Gregory

Sidebar: University of Puerto Rico Students Spend Summer in Lexington