CogniFiber Hits Landmark Interface Speed, Enabling 5X Faster AI Computing Than the Leading Photonics Solution

Dr. Eyal Cohen

CogniFiber’s photonics breakthrough brings AI deployments closer to reality. “Our current results are outstanding, demonstrating capabilities to support future scalability and sustainability of data centers and edge computing,” said Dr. Eyal Cohen, Co-founder & CEO of CogniFiber.

Tel Aviv, Israel – CogniFiber, a deep technology company revolutionizing photonic computing, announced a landmark interface data-injection speed achieved in its product integration tests. CogniFiber’s approach implements direct analog neuromorphic photonic computing (“pure photonics”), removing the bottlenecks from AI inference so that task speed is limited only by the input clock.

Demonstrating capabilities 40% beyond initial projections, the company’s soon-to-be-launched pure-photonic system is expected to reach a staggering 100 million tasks per second, far surpassing the estimated 5 million tasks per second of NVIDIA’s DGX A100 and the estimated 24 million tasks per second of Lightmatter’s Envise Server.* Combined with modest power consumption, these capabilities allow companies to deploy powerful computing resources at just shy of 350 thousand tasks per watt. As traditional silicon-based semiconductor developers work to fit ever more transistors onto a small chip to boost processing power, data centers and hyperscalers are left managing high costs driven by lower reliability, high power consumption, and cooling constraints.
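For illustration only, the cited figures can be sanity-checked with a short calculation. This is a back-of-the-envelope sketch that takes the claimed throughput (100 million tasks per second) and efficiency (just shy of 350 thousand tasks per watt) at face value; the speedup ratios and implied power draw it derives are not published numbers.

    # Back-of-the-envelope comparison using the figures cited above.
    # Assumption: the throughput and efficiency claims are taken at face value;
    # the derived ratios and implied power draw are illustrative, not published.
    cognifiber_tasks_per_sec = 100e6   # claimed pure-photonic system throughput
    dgx_a100_tasks_per_sec = 5e6       # estimated NVIDIA DGX A100 throughput
    envise_tasks_per_sec = 24e6        # estimated Lightmatter Envise Server throughput
    tasks_per_watt = 350e3             # claimed efficiency ("just shy of" this value)

    print(f"Speedup vs. DGX A100: {cognifiber_tasks_per_sec / dgx_a100_tasks_per_sec:.0f}x")  # ~20x
    print(f"Speedup vs. Envise:   {cognifiber_tasks_per_sec / envise_tasks_per_sec:.1f}x")    # ~4.2x
    print(f"Implied power draw:   {cognifiber_tasks_per_sec / tasks_per_watt:.0f} W")         # ~286 W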

To address these needs, CogniFiber developed a full solution that combines proprietary fiber-based technology with standard optical communication devices. The system requires a fraction of the space to achieve comparable computing power, while significantly reducing cooling and operational overhead. “Our initial findings are outstanding, displaying the capabilities to support the mega data centers of tomorrow,” said Dr. Eyal Cohen, Co-founder & CEO of CogniFiber. “Operating at room temperature without dissipating significant heat to its surroundings means data centers can offer customers greater uptime reliability with lower cost.”

The push to miniaturize today’s servers is allowing engineers to process large pools of IoT and AI data in record time and in a cost-effective manner. This also has commercial implications for businesses that want on-prem or localized servers that take up less space. “Our new approach to developing and harnessing large data processing capabilities allows for companies to begin bringing AI and Machine Learning capabilities to the edge of their networks,” said Professor Zeev Zalevsky, Co-founder & CTO of CogniFiber.

Beyond computing capabilities, power consumption has been a top-of-mind issue for data centers for years. As demand grows, operators are under the microscope regarding the environmental impact of their operations as they invest heavily in the energy needed to cool and operate thousands of advanced servers. CogniFiber’s pure-photonic systems and their reduced energy demands show great promise in reducing global greenhouse gas emissions while continuing to provide the advanced capabilities needed for tomorrow’s technologies.

*Estimates were based on DLRM benchmarks published by MLPerf.
