Ultra-Converged Edge Appliances: Purpose-Built for Edge Computing and Edge AI

Edge Computing Appliance

The deployment of cloud-native applications has created significant challenges for enterprises and service providers. Most application software was developed for datacenter deployment and assumes that virtualized applications run on preexisting server and network infrastructure.

Virtualized applications for network infrastructure functions, such as Firewalls, vCCAP, SD-WAN, etc., offer significant cost reductions, but require high-performance infrastructure to support the Virtual Functions (VFs) deployed on COTS servers. Additionally, these VFs may be deployed in non-datacenter locations, such as headends, central offices, etc.

Edge computing applications are now being deployed in locations that were never designed for IT systems, such as retail stores, oil rigs, network closets, manufacturing floors, etc. These locations have limited power, space and cooling, and usually lack skilled onsite IT resources.

Aparna Systems is pioneering composable “Edge Cloud” appliances through the ultra-convergence of compute, GPU, storage and networking resources in compact, energy-efficient, open-software appliances. To meet the physical demands of the edge, Aparna’s Edge Computing systems are also designed to operate over a wider temperature range and in harsher environments.

The µCloud 4015 Edge Cloud

  • The µCloud™ 4015 system supports up to 15 µServers in a 4U chassis
  • Multiple µServer models with 8-, 12- or 16-core Xeon CPUs, up to 128 GB of DDR RAM, and optional dual SATA or NVMe SSD storage
  • The Aparna μCloud appliance’s embedded network operating system supports a feature-rich networking stack (L2/L3/MPLS)
  • Workload composable to support network functions, such as Firewalls, SD-WAN, vRAN, vCCAP, etc.
  • Support for bare metal, containerized and/or virtualized server clusters for single- and multi-application environments
  • High-performance, high-availability design with non-blocking system-level throughput
  • Low total cost of ownership with its disruptive reductions in CapEx and OpEx, setting a new price/performance standard for the industry

Edge Artificial Intelligence (AI) Appliance

Actionable AI is moving from deep-learning model training to the deployment of trained neural models on edge AI appliances. Edge AI applications are being deployed for industrial IoT, autonomous transportation, surveillance, retail automation, augmented and virtual reality (AR/VR), AI-assisted medical imaging, and other inference applications.
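The train-centrally, infer-at-the-edge split described above can be sketched in a few lines. This is an illustrative toy only, not Aparna software: the model is a simple logistic classifier whose weights are made-up stand-ins for coefficients that would, in practice, be trained offline in a datacenter and shipped to the appliance as an artifact, so that each sensor reading is scored locally with no round trip to a central cloud.

```python
import math

# Hypothetical coefficients for illustration; in a real deployment these
# would be produced by offline training and loaded onto the edge appliance.
WEIGHTS = [0.8, -1.5, 0.3]
BIAS = -0.2

def infer(features):
    """Score one sensor reading locally on the edge device.

    `features` is a single reading (e.g., vibration, temperature, load);
    returns the estimated probability that the reading is anomalous.
    """
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))  # logistic activation

# Inference happens on-appliance, so the decision latency is local.
score = infer([0.9, 0.1, 0.4])
alert = score > 0.5
```

The point of the sketch is the division of labor: the heavy, iterative training step stays in the datacenter, while the edge appliance runs only the cheap forward pass, which is why constrained OT locations can host it.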

This deployment of AI at the edge demands products and platforms that are purpose-built for deployment outside of traditional datacenter locations, in Operational Technology (OT) environments. The environmental requirements vary significantly across applications in retail settings, manufacturing floors, cell sites, etc. This has created demand for integrated systems with optimized interactions among CPU, GPU, memory, storage and networking. These systems cannot be mere adaptations of datacenter servers; they must be purpose-built to operate beyond datacenter/office temperature ranges while remaining energy-efficient and compact, with flexible mounting options.

Aparna GX2 Edge AI Appliance

The Aparna Systems GX2 is an Edge AI Appliance that converges CPU, GPU, memory, storage and networking in a compact form factor purpose-built for deployment outside the datacenter, in OT environments. These locations often have constraints on power, cooling and space, and require operation over a wider temperature range. The GX2 includes:

  • Servers with 8-, 12- or 16-core Xeon CPUs, up to 128 GB of DDR RAM, and dual SATA SSD storage
  • Up to two NVIDIA T4 GPUs for a total of 16.2 TFLOPS
  • Integrated L2 switch with 1GE PoE and 10GE ports
  • Workload composable for edge inference applications, such as:
    • Surveillance
    • Retail Automation
    • Industrial IoT
    • Augmented and Virtual Reality
    • Multi-access Edge Computing (MEC)

"Aparna’s open software architecture and support for ONIE [Open Network Install Environment] make the µCloud system an ideal platform for our OcNOS software."

Atsushi Ogata, President and CEO at IP Infusion

"Aparna’s Cloud-in-a-Box has the potential to be a real game-changer in a variety of applications, particularly at the edge of the network where organizations have struggled to find a practical and affordable way to deploy adequate resources."

Michael Howard, Senior Research Director and Advisor for Carrier Networks at IHS Markit

"The industry has adopted a cloud-first strategy, and the expansion of cloud to the edge of the communication service provider infrastructure favors a compact system like Aparna’s Cloud-in-a-Box."

Lee Doyle, Principal Analyst at Doyle Research

"Our network performance analytics platform requires us to detect problems and identify root causes in real-time, and Aparna’s Cloud-in-a-Box enables us to do that in a self-contained system with a remarkably small footprint."

Dr. Charles Barry, Co-founder and Chief Technical Officer at Jolata

"The Aparna Systems platform is extremely well-positioned and well-timed to capitalize on the enormous growth expected in the virtualization of network functions and services in a distributed cloud architecture."

Jim Metzler, Principal Analyst at Ashton, Metzler & Associates

Press Release

Aparna Systems Emerges from Stealth Mode to Announce the Industry’s First Open Software “Cloud-In-a-Box” Solution

In The News

Next-Generation Server Clusters: Boxes Without Bottlenecks

This article explores the underlying causes of performance bottlenecks in server clusters today and introduces the component-level architecture as a possible solution.