The University of Texas at Austin and tech giant Cisco struck a research agreement that will begin with an emphasis on artificial intelligence and machine learning before branching out into additional technologies in the future.

As part of the five-year partnership, Cisco Research will provide funding and expertise for four AI/ML research projects and one cybersecurity research project over the next year. Researchers from the Cockrell School of Engineering and the College of Natural Sciences will delve into several areas of AI and ML, including the Internet of Things, computer vision, the training of learning networks and more.

The goal is for Cisco and UT Austin to add more projects to the alliance on a yearly basis. Other areas of emphasis the organizations are looking at include natural language processing, augmented/virtual reality, cybersecurity, edge computing and more.

“We work closely with our strategic research partners to understand their unmet needs, and we identify mutual areas primed for foundational and applied research collaborations,” said John Ekerdt, associate dean for research at the Cockrell School and a professor in the McKetta Department of Chemical Engineering. “We are excited to welcome Cisco as our latest strategic research partner to UT and the Cockrell School of Engineering.”

“Cisco Research conducts and fosters cutting-edge research in areas of strategic interest to Cisco. Since 2020, we have partnered with several top universities with broad research strengths to fund and work with faculty on interesting projects that can make a material impact on Cisco’s future innovations. As part of this initiative, we are proud to have entered this partnership with UT Austin, through which we hope to fund several research projects over the next few years,” said Dr. Ramana Kompella, the Head of Cisco Research.

Here is a look at the first five projects included in the new partnership:

Radu Marculescu, a professor in the Department of Electrical and Computer Engineering, will develop a new approach to handling issues of data, latency and power usage in large networks of different types of devices using a federated learning algorithm. Federated learning allows many devices on a network to collaboratively learn training and prediction models while keeping all the data confined to individual devices. The research aims to improve the scalability and performance of federated learning, especially for large, dynamic networks.
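The mechanics can be illustrated with a minimal federated averaging (FedAvg) sketch; the toy linear model, device count and learning rate below are illustrative assumptions, not details of Marculescu's actual algorithm:

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One device's local gradient-descent steps on its private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_w, device_data):
    """Server averages local models, weighted by each device's dataset size.
    Only model weights travel over the network; raw data never leaves a device."""
    updates, sizes = [], []
    for X, y in device_data:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Four hypothetical devices, each holding its own private slice of data.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
devices = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    devices.append((X, y))

w = np.zeros(2)
for _ in range(30):  # communication rounds
    w = federated_round(w, devices)
print(np.round(w, 2))  # close to [ 2. -1.]
```

The scalability question the project targets shows up exactly here: every round requires each device to ship a full model update, which becomes expensive as networks grow and devices churn.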

Aditya Akella, professor of computer science in the College of Natural Sciences, is tackling challenges that come with large-scale AI networks in multi-tenant settings, where single copies (instances) of software applications serve many customers. The size of these models can create network bottlenecks that slow down training and limit scaling. Akella plans to develop a “communication substrate” that can eliminate these network bottlenecks and ensure information is transferred quickly enough to support the growth of these learning models.
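The article doesn't describe the substrate itself, but ring all-reduce gives a concrete sense of how distributed training avoids a central network bottleneck: each worker exchanges only small chunks with its neighbor instead of shipping full gradients to one host. This is a simulated sketch for context, with the worker count and gradient sizes chosen purely for illustration:

```python
import numpy as np

def ring_allreduce(grads):
    """Simulate ring all-reduce: sum equal-length gradient vectors across n
    workers using only neighbor-to-neighbor chunk transfers."""
    n = len(grads)
    chunks = [list(np.array_split(g.astype(float), n)) for g in grads]

    # Reduce-scatter: after n-1 steps, each worker owns one fully summed chunk.
    for t in range(n - 1):
        sends = [chunks[i][(i - t) % n].copy() for i in range(n)]
        for i in range(n):
            chunks[(i + 1) % n][(i - t) % n] += sends[i]

    # All-gather: circulate each completed chunk until every worker has all of them.
    for t in range(n - 1):
        sends = [chunks[i][(i + 1 - t) % n].copy() for i in range(n)]
        for i in range(n):
            chunks[(i + 1) % n][(i + 1 - t) % n] = sends[i]

    return [np.concatenate(c) for c in chunks]

grads = [np.arange(4) * (w + 1) for w in range(4)]  # 4 workers, 4-element gradients
out = ring_allreduce(grads)
print(out[0])  # every worker ends with the elementwise sum [ 0. 10. 20. 30.]
```

Each worker sends and receives only 1/n of the gradient per step, so no single link or host has to absorb all the traffic, which is the kind of bottleneck the project aims to eliminate.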

Electrical and computer engineering assistant professor Sandeep Chinchali is developing a learning platform for Internet of Things devices that improves their computer vision capabilities by pulling training data from their own video streams. The aim is to build new machine learning algorithms that can automatically pick out valuable training data from these IoT devices, balancing accuracy against the network-bandwidth cost the process involves.
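One plausible shape for such a data-selection step is a simple uncertainty filter: upload only the frames the deployed model is least confident about, capped by a bandwidth budget. Everything below (the confidence scores, the budget, the function name) is a hypothetical sketch, not the project's actual method:

```python
import numpy as np

def select_frames(confidences, bandwidth_budget):
    """Return indices of the least-confident frames within the upload budget.

    confidences: per-frame confidence score from the on-device model.
    bandwidth_budget: number of frames we can afford to upload for retraining.
    """
    order = np.argsort(confidences)  # least confident first
    return sorted(order[:bandwidth_budget].tolist())

# Ten frames; the model is unsure about frames 2, 5 and 7.
conf = np.array([0.98, 0.95, 0.41, 0.97, 0.93, 0.55, 0.96, 0.60, 0.99, 0.94])
print(select_frames(conf, bandwidth_budget=3))  # -> [2, 5, 7]
```

Tightening the budget trades bandwidth for accuracy: fewer uploaded frames mean less network cost but less new training signal, which is exactly the balance the article describes.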

Computer science professor Hovav Shacham is zeroing in on a security vulnerability that comes with using a third-party library to add features to a software system. The key to rectifying it is restricting the memory the library can read and write. WebAssembly, a type of code that runs in modern web browsers, is a strong candidate to overcome this vulnerability, according to the researchers, because it is already restricted in terms of the memory processes it can access.
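The underlying idea can be mimicked in a few lines: give the untrusted library its own memory region and bounds-check every access, much as WebAssembly confines a module to its linear memory. This toy `Sandbox` class is purely illustrative and stands in for the real isolation machinery:

```python
class Sandbox:
    """Toy model of sandboxed library memory: the 'library' may only touch
    the region it was allotted, and every access is bounds-checked."""

    def __init__(self, size):
        self.mem = bytearray(size)  # the only memory the library may use

    def load(self, addr):
        if not 0 <= addr < len(self.mem):
            raise MemoryError("out-of-bounds read")
        return self.mem[addr]

    def store(self, addr, value):
        if not 0 <= addr < len(self.mem):
            raise MemoryError("out-of-bounds write")
        self.mem[addr] = value & 0xFF

box = Sandbox(64)
box.store(10, 42)
print(box.load(10))  # 42
try:
    box.load(4096)   # outside the sandbox: rejected
except MemoryError as e:
    print(e)         # out-of-bounds read
```

WebAssembly enforces the same property in hardware-friendly form: a module's loads and stores are checked against its linear memory, so a buggy or malicious library cannot read or write the host application's memory.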

Zhangyang “Atlas” Wang, an electrical and computer engineering assistant professor, is examining smaller subnetworks within larger learning models, which require significant resources to run. These smaller subnetworks require fewer resources, and Wang’s research has found that they can perform as well as larger networks. Essentially, researchers could have used smaller subnetworks from the very beginning, but they didn’t know which ones to choose. Wang’s research will focus on finding the right subnetwork for different tasks.
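This line of work builds on the "lottery ticket" observation that a well-chosen sparse subnetwork can match the full network. A common starting point for identifying one is magnitude pruning, which keeps only the largest weights; the layer size and sparsity level below are illustrative:

```python
import numpy as np

def magnitude_mask(weights, keep_fraction):
    """Binary mask keeping the top `keep_fraction` of weights by magnitude;
    zeroed-out positions define the candidate subnetwork."""
    k = int(np.ceil(keep_fraction * weights.size))
    threshold = np.sort(np.abs(weights), axis=None)[-k]
    return (np.abs(weights) >= threshold).astype(weights.dtype)

rng = np.random.default_rng(1)
W = rng.normal(size=(4, 4))             # a toy weight matrix
mask = magnitude_mask(W, keep_fraction=0.25)
sparse_W = W * mask                     # the candidate subnetwork
print(int(mask.sum()), "of", W.size, "weights kept")  # 4 of 16 weights kept
```

The hard part, and the focus of the research described above, is not producing such a mask but finding the mask that actually preserves accuracy for a given task.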