The Daily Gamecock

USC research computing infrastructure ‘close to capacity’

System updates planned for next year

The computer systems USC researchers use have grown fragmented, too slow and too small, and the university plans to address those problems over the next year, administrators said.

The details of those changes, though, are still uncertain. Bill Hogue, USC's vice president for information technology and chief information officer, said the university hopes to draft a plan this summer, gather faculty and student feedback and begin implementing it in September.

Hogue also said he wasn't sure what the plan would cost — or what budget would be available to bring it to fruition.

The problem stems in large part from researchers' growing needs and the "truly massive" volume of data they are creating, Hogue said.

Tom Vogt, associate vice president for research, said researchers are increasingly generating huge, complex sets of data, like entire genome sequences, and they're expected to make them available to the public online.

"The requirement to make them publicly accessible ... has driven our cyber infrastructure close to capacity," he wrote in an email response. "We are sailing hard against the wind, so to speak."

But the system has also struggled with a fragmented structure caused by incremental fixes, said Vogt, who is heading his office's efforts to update the system.

For years, the infrastructure for research computing has been built around individual departments and programs. When those systems needed updates, they were done bit by bit.

The result: a patchwork system that's struggling to satisfy researchers' needs.

Now, updating that infrastructure represents one of the Office of Research's top 10 goals, according to Prakash Nagarkatti, USC's vice president for research.

To build a more cohesive system, Hogue said, USC is likely to move toward a university-wide system linked to a "cloud," meaning it will be increasingly wired into national computing networks. That approach would follow many of the recommendations of a study the university conducted three years ago.

As computer systems worldwide increasingly move to cloud-based structures, the university no longer needs to focus on buying its own memory and computing hardware, which can prove expensive.

"I actually believe that a lot of what our future involves is linking to external resources rather than making massive investments internally," Hogue said. "I think we need to invest in making sure our researchers can get access to computing cycles and to massive amounts of storage."

Among the investments, Hogue said, will likely be the hiring of more staff who are well-versed in research computing technology to assist researchers, hires that could come at a steep cost. How many new staff members USC needs, he said, is not yet clear.

"People who do a good job of providing research computing support, who are hired, for example, to support faculty research teams with their computing knowledge, are rare," Hogue said. "They are expensive, and no university I am aware of, including great universities in research arenas like MIT, ever has enough of those kinds of people."
