Solving the power consumption problem while delivering performance for modern workloads

Parallel ARM Node Designed Architecture (PANDA)

A Revolutionary Approach to Server Design

Bamboo Systems’ patented Parallel ARM Node Designed Architecture (PANDA) was purpose-built to deliver high-density throughput computing using embedded systems methodologies, significantly reducing the thermal profile of the server. Every design decision, from processor selection through network and storage connectivity, maximizes capability while reducing power consumption and, with it, the heat generated. This in turn enables industry-leading compute and thermal density, reducing the space required. Bamboo servers are built for modern workloads and can be deployed anywhere from the edge to the hyperscale data center.

Parallel ARM Node Designed Architecture (PANDA)

Bamboo Systems’ patented architecture uses embedded systems methodologies to maximize compute throughput for modern microservices-based workloads while drawing as little power, and generating as little heat, as possible.

Bamboo servers offer:

  • Up to 10 times the compute density, achieved by reducing power consumption and heat through:
    • Removing unnecessary peripherals
    • Turning off individual application processors when possible
    • Minimizing indirect connections, such as ribbon cables
    • Using 48V power supplies, avoiding unnecessary, power-hungry AC/DC/AC conversions
  • Increased security and maximized application processor capability, achieved by separating I/O processing from application processing
  • Significantly reduced memory bottlenecks, thanks to a high ratio of memory channels to cores
  • NVMe storage directly attached to every application processor, reducing the amount of DRAM required while increasing overall performance and lowering power consumption and cost
  • Built-in CPU-assisted smart NICs that offload encryption/decryption and compression/decompression work from the application processor
  • Bursting of up to 50Gb/s on a single network port into each application processor
  • Embedded network switching, significantly reducing cable management and eliminating the cost of additional switches within the rack

Extremely high throughput computing is delivered by an unparalleled ratio of memory controllers to cores, reducing the memory bottleneck.
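
As a rough illustration of why that ratio matters, the sketch below computes the peak memory bandwidth available to each core for two hypothetical configurations; the channel counts, core counts and DDR4-3200 per-channel rate are illustrative assumptions, not published PANDA specifications.

```python
# Back-of-the-envelope illustration only: the channel counts, core counts and
# the 25.6 GB/s DDR4-3200 per-channel rate are assumptions, not PANDA specs.

def bandwidth_per_core(channels: int, cores: int, gb_s_per_channel: float = 25.6) -> float:
    """Peak theoretical memory bandwidth each core can draw on, in GB/s."""
    return channels * gb_s_per_channel / cores

# Many cores sharing a few channels vs. a small node with its own channels.
print(f"64 cores, 8 channels: {bandwidth_per_core(8, 64):.1f} GB/s per core")
print(f" 8 cores, 2 channels: {bandwidth_per_core(2, 8):.1f} GB/s per core")
```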

PANDA-based servers also offload network and storage I/O to a separate processor, making the system more secure and enabling data locality through direct access to large amounts of storage.

Why It Matters in Modern Software Design

PANDA-based servers are designed for microservices-based software architectures rather than monolithic software design and are able to run Kubernetes right out of the box.
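
As an illustration of how a microservices workload might be placed onto such Arm nodes, the sketch below uses the standard Kubernetes Python client to create a small Deployment pinned to arm64 nodes via the standard kubernetes.io/arch label; the image name, replica count and resource requests are illustrative assumptions rather than anything specific to Bamboo’s stack.

```python
# Minimal sketch: deploy a containerised service onto arm64 nodes with the
# Kubernetes Python client. Image, replica count and resource requests are
# illustrative assumptions, not Bamboo-specific values.
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig
apps = client.AppsV1Api()

container = client.V1Container(
    name="demo",
    image="nginx:stable",  # any multi-arch or arm64 image
    resources=client.V1ResourceRequirements(
        requests={"cpu": "500m", "memory": "256Mi"},
    ),
)

pod_spec = client.V1PodSpec(
    containers=[container],
    # Standard node label: schedule only onto Arm application processors.
    node_selector={"kubernetes.io/arch": "arm64"},
)

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="demo"),
    spec=client.V1DeploymentSpec(
        replicas=4,
        selector=client.V1LabelSelector(match_labels={"app": "demo"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "demo"}),
            spec=pod_spec,
        ),
    ),
)

apps.create_namespaced_deployment(namespace="default", body=deployment)
```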

The architecture gives each application dedicated resources for better throughput and greater security, because network I/O runs on a separate processor.

Parallel throughput is further improved with locally attached storage.

Why It Matters to the Planet

Hyperscalers have been driving towards net-zero data centers, but they have concentrated on addressing the symptoms of the problem, such as how to use renewable energy, how to reduce the electrical power used by data center infrastructure and, finally, how to re-use the heat generated. None of these measures, however, resolves the basic issue: the heat generated in the first place.

PANDA-based servers are designed specifically to solve this problem: the generation of heat. We use between 80% and 95% less electricity for a given workload, thereby generating less heat.
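
As a minimal sketch of what that range means in practice, the snippet below applies the quoted 80–95% saving to a hypothetical 10 kW rack; since essentially all electricity drawn ends up as heat, the heat that must be removed falls in step with the power drawn.

```python
# Hypothetical example: the 10 kW baseline is an assumed figure for a
# conventional rack serving a fixed workload, not a measured value.
def power_after_saving(baseline_kw: float, saving: float) -> float:
    """Power drawn for the same workload after applying the quoted saving."""
    return baseline_kw * (1.0 - saving)

baseline_kw = 10.0
for saving in (0.80, 0.95):
    drawn = power_after_saving(baseline_kw, saving)
    print(f"{saving:.0%} less electricity: {drawn:.1f} kW drawn, "
          f"{baseline_kw - drawn:.1f} kW less heat to dissipate")
```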

Imagine how much further those data center infrastructure strategies will go when used in combination with a PANDA-based server.
