Moreh Showcases Highly Efficient Distributed Inference on AMD and Reveals Collaborations with Tenstorrent and SGLang
SEOUL, South Korea and SANTA CLARA, Calif., September 11, 2025 /PRNewswire/ -- Moreh, an AI infrastructure software company, unveiled its distributed inference system on AMD and presented the progress of its collaborations with Tenstorrent and SGLang at the AI Infra Summit 2025 in Santa Clara, held from September 9 to 11.
The AI Infra Summit is the world's largest and most established conference dedicated to the AI and machine learning infrastructure layer. Originally launched as the AI Hardware Summit in 2018, it has evolved from a semiconductor-focused conference into a comprehensive AI infrastructure event.
The 2025 summit attracted 3,500 attendees and more than 100 partners, with content designed for hardware vendors, hyperscalers, and all AI and compute infrastructure specialists building fast, efficient, and affordable AI.
At the Enterprise AI session on September 10, Moreh CEO Gangwon Jo presented the company's distributed inference system and shared benchmark results demonstrating that it runs the latest deep learning models, such as DeepSeek, more efficiently than NVIDIA. He also unveiled a next-generation AI semiconductor system combining Moreh software with Tenstorrent hardware, offering a range of competitive alternatives to NVIDIA.
During the summit, Moreh co-hosted a presentation with SGLang, a leader in the deep learning inference software ecosystem, and jointly ran a booth and networking sessions. This was an opportunity to further strengthen collaboration with the global AI ecosystem, particularly in the North American market. In addition, Moreh plans to jointly develop an AMD-based distributed inference system with SGLang to accelerate its expansion into the rapidly growing deep learning inference market.
Moreh CEO Gangwon Jo said, "Moreh has the strongest technical capabilities among AMD's global software partners and is currently conducting proofs of concept (PoCs) with several leading LLM companies," adding, "Through close collaboration with AMD, Tenstorrent, and SGLang, we aim to establish ourselves as a global company providing customers with diverse AI computing alternatives."
Moreh is developing its own core AI infrastructure engine and, through its foundation LLM subsidiary Motif Technologies, secures comprehensive technology capabilities extending to the model domain. At the same time, the company is building its brand in the global market through collaborations with key partners such as AMD, Tenstorrent, and SGLang.
View original content to download multimedia: https://www.prnewswire.com/apac/news-releases/moreh-and-sglang-team-po-showcase-shtributed-inference-onference-on-amd-at-infra-summit-2025-302553303.html
SOURCE Moreh