Dell AI Factory with NVIDIA Delivers Proven Path to Enterprise AI ROI

Dell Technologies marks the two-year anniversary of the Dell AI Factory with NVIDIA by announcing advancements across its AI data platform, end-to-end AI infrastructure, and AI solutions and services portfolio that help enterprises move AI from pilot to production at scale. With over 4,000 customers deploying the Dell AI Factory, and early adopters seeing up to 2.6x ROI within the first year, Dell proves that an end-to-end approach delivers measurable business results.

Why This Matters

The enterprise AI landscape is undergoing a fundamental shift. As AI code assistants and agentic workflows drastically lower the cost and time to build custom applications, CIOs are increasingly choosing to develop AI capabilities in-house, on-premises—driving the need for owned infrastructure.

Yet unclear ROI remains the top obstacle preventing AI deployments at scale. Two years of the Dell AI Factory with NVIDIA has revealed three critical requirements for achieving measurable returns: data platforms that make enterprise information AI-ready, infrastructure that efficiently scales the latest innovations from pilot to production, and solutions and services that compress time to value by simplifying deployments and accelerating ROI. Dell is the premier provider delivering all three with NVIDIA technology at the core, creating a proven path from AI investment to business outcome.

Three Capabilities That Define Enterprise AI Leadership

Dell, the top AI infrastructure provider, offers the industry’s broadest AI infrastructure portfolio, delivering integrated capabilities across data, infrastructure, and solutions and services.

Data platforms that turn institutional knowledge into AI fuel

AI is rapidly shifting from assistive tools to autonomous, agentic systems, but its effectiveness is constrained by the data it can access, trust and act upon. The Dell AI Data Platform with NVIDIA addresses this challenge with a unified platform for AI that combines Dell’s high-performance storage, modular data engines, and NVIDIA accelerated computing, networking, software and CUDA-X libraries. As the data foundation of the Dell AI Factory with NVIDIA, it handles workloads from retrieval-augmented generation (RAG) and multimodal search to agentic workflows and large-scale data processing. Advancements announced today make it faster and easier for companies to turn data into real AI results.
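
Retrieval-augmented generation, the first workload named above, follows a simple pattern: embed the query, retrieve the closest enterprise documents, and prepend them to the model prompt as context. The sketch below is purely illustrative and is not Dell's or NVIDIA's implementation; it swaps the GPU-accelerated embedding model and vector database for a toy bag-of-words encoder so the retrieve-then-generate flow is visible end to end.

```python
# Toy RAG sketch. Illustrative only: a production system would use a
# GPU-accelerated embedding model and a vector database, not bag-of-words.
from collections import Counter
import math

def embed(text: str) -> Counter:
    """Stand-in for an embedding model: a bag-of-words term-count vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, corpus: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

corpus = [
    "Quarterly revenue grew 12 percent year over year.",
    "The support runbook covers GPU driver installation.",
]
context = retrieve("how do I install GPU drivers", corpus)
# The retrieved context is prepended to the LLM prompt:
prompt = f"Context: {context[0]}\nQuestion: how do I install GPU drivers"
```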

Infrastructure that enables AI workflows from desktop to data center

Dell’s next-generation infrastructure supports AI workflows at every stage, from rapid prototyping to production deployment at scale.

For desktop AI development and autonomous agents:

For production AI at scale:

  • PowerEdge XE9812, Dell’s flagship liquid-cooled server, leverages the NVIDIA Vera Rubin NVL72 platform for massive real-time training and inference.
  • PowerEdge XE9880L, XE9882L, and XE9885L are liquid-cooled servers featuring NVIDIA HGX™ Rubin NVL8, designed to deliver validated AI performance within existing data center footprints and power constraints.

For enterprise workloads in the data center:

For high-performance networking and emerging technologies:

  • Dell PowerSwitch SN6000-series are NVIDIA Spectrum-6 Ethernet switches offering 1.6 Tb/s speeds, with liquid cooling and co-packaged optics options for Vera Rubin-based Dell platforms.
  • PowerSwitch SN5610 and SN2201 now offer expanded network OS choices including Cumulus Linux and Enterprise SONiC Distribution by Dell Technologies.
  • NVIDIA Quantum-X800 InfiniBand Q3300-LD liquid-cooled switches deliver high-bandwidth networking for AI and cloud-native workloads.
  • Dell Integrated Rack Scalable Systems (IRSS) expands to include Dell PowerSwitch and NVIDIA liquid-cooled switching, providing unified, rack-level power and cooling management for AI infrastructure.

NVIDIA NVQLink and NVIDIA CUDA-Q support – Dell is the first OEM to integrate NVIDIA NVQLink with CUDA-Q across PowerEdge servers featuring NVIDIA AI infrastructure, allowing enterprises and research institutions to explore emerging quantum-classical computing use cases. These capabilities accelerate discoveries in advanced drug development and materials science simulation by combining the processing power of quantum processing units with NVIDIA accelerated computing for quantum systems control and error correction, all on a trusted foundation of Dell PowerEdge servers.

Solutions and services that accelerate deployment and prove ROI

Updated Dell AI Solutions combine new modular architecture with Dell Automation Platform blueprints and NVIDIA AI Enterprise software to deliver enterprise outcomes while simplifying operations and reducing deployment complexity. New services bridge skill gaps and scale deployments from experimentation to production.


Accelerating enterprise AI workloads:

  • Knowledge assistant provides the foundation for designing, deploying and managing intelligent assistants, working with industry leaders like Aible, Cohere’s North and NVIDIA.
  • ClearML blueprint improves agentic AI environments for enterprises with secure, efficient GPU cluster management and workload scheduling.
  • Agentic AI platform, in collaboration with Cohere’s North, DataRobot and NVIDIA, allows enterprises to securely deploy and manage AI agents with orchestration, governance and observability.
  • Dell Accelerator Services for Agentic AI provide packaged capabilities to support businesses at any stage, from experimentation and validation to enterprise-wide integration, closing skill gaps and reducing technical complexity.


Simplifying AI infrastructure deployment:

  • Dell AI Factory with NVIDIA modular architecture offers a clear, simplified path to enterprise AI by addressing deployment complexity, managing rapid technology change and supporting continuous adoption. Integrated automation gives organizations the flexibility to start at the right size and scale as needs evolve.


Michael Dell, chairman and chief executive officer, Dell Technologies: “Two years ago, enterprises were asking how to access AI technology. Today, they’re asking how to make their data AI-ready, how to operationalize AI at scale and how to prove ROI. The Dell AI Factory with NVIDIA answers all three questions. We’re brought in from the start as a trusted advisor, helping customers navigate their entire AI journey—from turning raw data into AI fuel, through deployment and to measurable business outcomes.”

Jensen Huang, founder and chief executive officer, NVIDIA: “AI infrastructure is being built everywhere — every company will be powered by it, every country will build it — and it demands integrated data platforms, scalable infrastructure and deployment expertise. Dell Technologies delivers all three, with NVIDIA at the core. The Dell AI Factory with NVIDIA is a proven infrastructure blueprint for every phase of AI powering the next industrial era.”

Availability

  • Dell Pro Precision 5 and 7 Series mobile workstations with NVIDIA RTX PRO Blackwell GPUs will be available in May.
  • Dell Pro Precision 9 T2/T4/T6 will be available in May.
  • Dell began shipping Dell Pro Max with GB300 to select customers in March 2026, with plans to ship more broadly in the coming months.
  • Dell PowerEdge XE9812 will be globally available 2H 2026.
  • Dell PowerEdge XE9880L, XE9885L will be globally available Q3 2026.
  • Dell PowerEdge R770, R7715 and R7725 with NVIDIA RTX PRO 4500 Blackwell Server Edition GPUs are globally available now.
  • Dell PowerEdge M9822 and R9822 will be globally available in September.
  • Dell PowerSwitch SN6000-Series will be globally available starting in July.
  • Dell SONiC with Spectrum-based PowerSwitch SN5610 and SN2201 will be globally available in March.
  • NVIDIA Quantum-X800 Q3300-LD will be globally available by Dell Technologies in Q4 2026.
  • Dell PowerEdge NVIDIA NVQLink and CUDA-Q integration is available now.
  • Knowledge assistant is globally available now.
  • Agentic AI platform with Cohere’s North and DataRobot is available now; agentic AI platform with ClearML will be available in March.
  • Dell AI Factory with NVIDIA modular architecture will be globally available in April.
  • Dell Accelerator Services for Agentic AI are available now.

OBSBOT Tiny 3, Tiny 3 Lite and OBSBOT Vox SE First Look

OBSBOT, an industry leader in smart videography technology, today announces the global launch of the OBSBOT Tiny 3 and OBSBOT Tiny 3 Lite, a new generation of AI-powered 4K PTZ webcams designed around one core belief: delivering studio-grade audio and outstanding video in one seamless experience. 

The Tiny 3 Series sets a new benchmark for professional webcams by combining studio-grade spatial audio, flagship-level imaging, and industry-leading AI intelligence in an ultra-compact aluminum-alloy body. Designed for content creators, podcasters, remote professionals, educators, and live streamers, the series delivers premium sight and sound—without external microphones, cable clutter, or complex setups.

“With traditional webcams, audio quality is often an afterthought,” said Liu Bo, CEO of OBSBOT. “With the Tiny 3 Series, we rethought the real needs of professionals and creators: Video quality still matters, but audio, intelligence, and ease of use are just as critical. Tiny 3 delivers flagship imaging in an ultra-compact form for creators who want the best, while Tiny 3 Lite brings the same professional audio and AI experience to everyday workflows at a more accessible level. Together, the Tiny 3 Series gives users the freedom to choose the performance that best fits how they create and communicate.”

Revolutionary Top-Tier Spatial Audio Performance

Both Tiny 3 and Tiny 3 Lite feature an industry-leading triple silicon MEMS microphone array, combining one omnidirectional and two directional microphones powered by Sisonic MEMS technology. This architecture—commonly found in premium TWS earbuds and professional audio equipment—delivers:

  • Ultra-low distortion up to 130 dB SPL
  • An exceptional 69 dB signal-to-noise ratio
  • Full-spectrum 50Hz–20kHz frequency response
  • Consistent, unit-to-unit studio-quality performance


The result is clear, natural, and immersive sound that rivals dedicated microphones—directly from a webcam.

To adapt seamlessly to different creative and professional scenarios, the Tiny 3 Series offers five dedicated audio profiles:

  • Pure Audio Mode – Zero processing, studio-grade capture for music recording, ASMR, and precision post-production
  • Spatial Audio Mode – Immersive stereo with enhanced left-right separation, perfect for vlogs and ambient storytelling where soundscape is key
  • Smart Omni Mode – AI-powered 360° pickup that balances vocals and ambient sound for meetings, group discussions, and multi-speaker streams
  • Directional Mode – Focus on the voice in front while cutting surrounding noise, ideal for solo presentations and podcasts in noisy spaces
  • Dual-Directional Mode – Capture clear audio from front and rear while rejecting side noise, built for interviews, two-host podcasts, and face-to-face conversations

With Voice Locator, the Tiny 3 Series doesn’t just hear you—it finds you. Using spatial audio positioning, the webcam detects the direction of speech, smoothly rotates the gimbal, and initiates tracking automatically. Combined with voice commands like “Hi Tiny,” users can start recording or meetings from anywhere in the room. The series also supports the OBSBOT Vox SE wireless lavalier for cable-free professional audio.

Flagship Imaging in the Tiniest 4K PTZ Webcam

Tiny 3 features the smallest 4K PTZ webcam design in the world, at just 63 g and 37 × 37 × 49 mm, compact enough to disappear into any professional or creator setup. Inside this compact body is a 1/1.28-inch flagship CMOS sensor, delivering:

  • 4K@30fps video and 1080p@120fps high-frame-rate capture
  • Advanced DCG HDR for balanced highlights and shadows without motion blur
  • Wide ISO 100–12800 range for excellent low-light performance
  • Ultra-fast Dual All-Pixel PDAF autofocus for confident live demos and movement

The result is crisp, cinematic video quality that rivals dedicated cameras—now in the smallest PTZ webcam ever built.

The slightly larger Tiny 3 Lite (73 g, 41 × 41 × 58 mm) brings professional imaging to a wider audience, featuring a 1/2-inch stacked CMOS sensor with advanced HDR algorithms, fast All-Pixel PDAF autofocus, and excellent low-light handling (ISO 100–6400). Supporting 4K@30fps and 1080p@120fps, it delivers smooth, reliable performance in a lighter, more portable integrated-stand design.

Industry-Leading AI Tracking 2.0

Powered by OBSBOT’s latest AI algorithms—refined from its professional Tail Series—the Tiny 3 Series introduces AI Tracking 2.0, delivering precise, reliable, and customizable tracking without constant manual adjustment.

Key capabilities include:

  • Accurate tracking of individuals, groups, hands, and objects, with “Only Me” Mode for locked single-subject focus and Zone Tracking for professional framing control
  • Auto Zoom, customizable composition lines, and Face Framing for professional-looking shots
  • Voice Tracking, combining audio positioning with visual tracking for hands-free operation
  • Multiple AI Modes, including dedicated Desk Mode and Whiteboard Mode (exclusive to Tiny 3), plus Hand Tracking
  • Intuitive gesture and voice controls for fluid, interruption-free workflows


Powerful software for pro control

Through OBSBOT Center and OBSBOT Live, users unlock DSLR-level creative controls, including exposure and gamma curve adjustment, manual white balance tuning, NVIDIA Maxine Eye Contact, background blur and replacement, overlays, and more—turning any desktop setup into a compact production studio.

Two new software innovations further expand creative possibilities for presentations, streaming and creative production:

  • RTC Remote Interaction enables real-time, bidirectional audio and video at 720p with sub-200ms latency, allowing remote participants to control PTZ, zoom, and AI tracking directly via browser.
  • Virtual Camera (VR) Features, powered by the OBSBOT Toolkit, turn Tiny 3 into a motion-capture-free virtual camera system with AI Tracking, pre-built avatars and scenes, and one-click switching between virtual avatars, voice tones, and environments.

Seamless OBSBOT Ecosystem integration

The Tiny 3 Series integrates effortlessly with the broader OBSBOT ecosystem, supporting:

  • OBSBOT Vox SE wireless audio
  • OBSBOT Talent for multi-camera live streaming
  • UVC – HDMI/NDI adapters
  • OBSBOT Tiny Smart Remote 2, Elgato Stream Deck, and Switch 2 compatibility

Enhanced Sleep Mode 2.0 ensures privacy, while wide platform compatibility simplifies deployment across professional workflows.

Availability

Both models are available starting today on the official OBSBOT online store, Amazon, and select retailers worldwide.

Learn more about Tiny 3 series:

Official website link: https://www.obsbot.com/obsbot-tiny-3-series-4k-ptz-webcam

Official store link: https://www.obsbot.com/store/products/tiny-3-series

Amazon store link: https://www.amazon.com/dp/B0G636CXQM


Get the OBSBOT software: https://www.obsbot.com/download 


Core42 Establishes European Headquarters in Dublin

Core42, a G42 company specializing in sovereign cloud and AI infrastructure, today announced the establishment of its European headquarters in Dublin, Ireland. The news was shared at Investopia, the UAE’s global investment platform, which is hosting its global dialogue series in Dublin this week. The new headquarters strengthens Core42’s ability to serve European enterprises and governments seeking secure, high-performance infrastructure to scale AI adoption.

Core42 was founded in 2023 by G42 to build globally relevant infrastructure for large-scale AI. The company focuses on sovereign cloud, advanced compute platforms, and hyperscale AI environments that support production-grade AI across sectors. Core42 partners with Microsoft, NVIDIA, AMD, Cerebras, and other global ecosystem leaders to ensure customers have access to the latest accelerators, models, and architectures.

Through its AI Cloud platform, Core42 provides fast, self-service access to high-performance compute for training, inference, and large-scale experimentation. Its services portfolio, managed delivery functions, and AI solutioning capabilities support customers through cloud modernization, data readiness, and the full AI adoption lifecycle.

Since 2024, Core42 has expanded its European presence through a series of large-scale sovereign compute initiatives. In France, Core42 partnered with Data One and Oreus to deliver a national-scale AI infrastructure deployment in Grenoble that supports high-performance enterprise and public sector workloads. In Italy, the company collaborated with Domyn to build Europe’s largest AI compute cluster, creating a strong foundation for an AI-first economy and accelerating the region’s ability to scale advanced AI solutions.

Establishing the European headquarters in Dublin marks the next phase of this expansion. The office will act as the regional hub for customer delivery, engineering leadership, regulatory engagement, and ecosystem partnerships. It positions Core42 to work more closely with European institutions and industry leaders as demand for scalable AI infrastructure accelerates across key sectors.

Commenting on the milestone, Talal M. Al Kaissi, Interim CEO of Core42, said: “Europe is a central part of Core42’s global expansion strategy. Establishing our headquarters in Dublin gives us the operational base to support growing demand for high-performance AI infrastructure and to work more closely with customers and partners as they scale production-grade AI across key sectors.”

Also at Investopia, Core42 together with Emerging Markets Intelligence and Research (EMIR), released a report that explores the infrastructure, policy, and investment conditions required for Europe to accelerate its AI capabilities. The report draws on comparative insights from the rapid AI scale-up in the UAE and provides practical guidance for governments, investors, and enterprises developing sovereign-aligned AI ecosystems. To download the report, click here.

Core42 will begin formal operations in Dublin in early 2026, with plans to expand engineering, customer success, and partner ecosystem teams throughout the year.

Dell Technologies brings data centre-class AI to the desktop with Dell Pro Max with GB10

Dell Technologies has today announced the availability of Dell Pro Max with GB10, a new desktop system that makes it easier for anyone building AI tools to do it right from their desk. Capable of handling massive AI models, the new Pro Max with GB10 uses NVIDIA’s Grace Blackwell chip and comes with 128GB of unified memory and up to 4TB of storage to support models with up to 200 billion parameters.

The system is designed to remove long-standing barriers in AI development, allowing Irish research teams, startups, regulated industries, and individual creators to train, fine-tune, and deploy advanced models locally—without relying on cloud solutions or compressing models to fit hardware limitations. Dell Pro Max with GB10 comes pre-installed with key AI tools such as CUDA, JupyterLab, Docker, and AI Workbench, enabling teams to start building in minutes.

By bringing this level of performance directly to the desktop, Dell Pro Max with GB10 transforms how AI work is done. Academic researchers at Irish universities and institutions can test hypotheses and adapt models rapidly, accelerating discovery. Startups gain enterprise-grade computational power without heavy infrastructure investment, allowing small teams to prototype, validate, and scale AI projects efficiently. Regulated industries can deploy secure AI workflows on-premises, protecting sensitive data while maintaining performance on par with leading cloud solutions. Independent creators and developers across Ireland now have the ability to develop sophisticated AI models from their own workspace, democratising innovation.

“Human ingenuity fuels AI progress, yet most teams hit hard limits on computation well before reaching their creative potential,” said Charlie Walker, Senior Director and GM, Dell Pro Max and Pro Rugged Products. “Dell Pro Max with GB10 empowers customers to advance securely, accelerate insight, and innovate on their own terms. This isn’t just another workstation; it’s an AI accelerator for real-world AI challenges, built for those who won’t let limits define what’s possible.”

The Dell Pro Max with GB10 is designed with scalability in mind. For teams requiring even greater power, connecting two systems creates a single node capable of handling 400 billion-parameter models, showcasing Dell’s approach to scalable AI infrastructure.
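
As a rough sanity check on those figures, weights quantized to 4 bits occupy about half a byte per parameter, so a 200-billion-parameter model needs on the order of 100 GB (within a single system's 128 GB of unified memory) and a 400-billion-parameter model roughly double, hence the two-system pairing. The arithmetic below is a back-of-envelope estimate under that assumed 4-bit quantization; real deployments also need headroom for the KV cache and activations.

```python
# Back-of-envelope model-memory estimate. Assumption: weights quantized to
# 4 bits (~0.5 bytes per parameter); KV cache and activations are ignored.
def model_size_gb(params_billion: float, bytes_per_param: float = 0.5) -> float:
    """Approximate weight footprint in GB for a quantized model."""
    return params_billion * 1e9 * bytes_per_param / 1e9

single = model_size_gb(200)  # ~100 GB, fits in one 128 GB system
paired = model_size_gb(400)  # ~200 GB, fits across two linked 128 GB systems
```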

By removing computational constraints and simplifying AI development, Dell Pro Max with GB10 enables faster innovation, more secure workflows, and broader access to advanced AI technology. This initiative demonstrates Dell Technologies’ commitment to empowering creators and organisations and supporting the growing AI ecosystem across Ireland to push the boundaries without compromising.

For more information on the new Dell Pro Max with GB10, visit: www.dell.ie

Dell AI Data Platform Advancements Help Customers Harness Data to Power Enterprise AI with NVIDIA and Elastic

Dell Technologies, the world’s No. 1 provider of AI infrastructure, today announced updates to the Dell AI Data Platform to help customers better support the full lifecycle of AI workloads, from ingestion and transformation to agentic inferencing and AI-powered knowledge retrieval.

Why it matters

Enterprise data is massive, growing rapidly and increasingly unstructured, but only a fraction of it is usable for generative AI today. To unlock its value, organizations need continuous indexing and a vector retrieval engine that converts content into embeddings for fast, precise semantic search. As workloads grow, organizations also need infrastructure that streamlines data preparation, unifies data access across silos and delivers end-to-end enterprise-grade performance.
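
The continuous-indexing half of that pipeline typically starts by splitting incoming documents into overlapping windows before they are embedded, so no passage loses its surrounding context at a boundary. A minimal sketch of such a chunker (the window and overlap sizes here are arbitrary illustrations, not platform defaults):

```python
# Overlapping-window chunker used ahead of embedding in indexing pipelines.
# The 200-character window and 40-character overlap are illustrative values.
def chunk(text: str, size: int = 200, overlap: int = 40) -> list[str]:
    """Split text into overlapping character windows."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

pieces = chunk("x" * 500)
# Each window shares its last `overlap` characters with the next one,
# so an embedding never loses the context that straddles a boundary.
```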

The latest updates to the Dell AI Data Platform enhance unstructured data ingestion, transformation, retrieval, and compute performance to streamline AI development and deployment – turning massive datasets into reliable, high-quality, real-time intelligence for generative AI.

Accelerating AI inferencing and analytics

The Dell AI Data Platform helps customers quickly move from AI experimentation to production by automating data preparation.

At the core of the Dell AI Data Platform’s architecture are specialized storage and data engines that help seamlessly connect AI agents to high quality enterprise data. Together, the Dell AI Data Platform and the NVIDIA AI Data Platform reference design provide a validated, GPU-accelerated solution that integrates storage engines and data engines with NVIDIA accelerated computing, networking and AI software to power generative AI systems.

Expanding the capabilities of the Dell AI Data Platform is the new unstructured data engine, designed to provide real-time, secure access to large-scale unstructured datasets for inferencing, analytics, and intelligent search. This engine, made possible through a new collaboration with open-source Search AI leader Elastic, will offer customers advanced vector search, semantic retrieval and hybrid keyword search capabilities—key capabilities for powering AI applications. Additionally, the unstructured data engine will leverage built-in GPU acceleration to deliver breakthrough performance.
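
As an illustration of what combining those retrieval modes looks like in practice, the Elasticsearch search API accepts a kNN vector clause alongside a conventional keyword query in a single request body, which is the usual shape of a hybrid search. The index field names and query vector below are hypothetical placeholders, not details of the Dell or Elastic product:

```python
# Illustrative Elasticsearch request body for hybrid retrieval: a kNN clause
# over a dense_vector field plus a lexical (BM25) match clause, scored together.
# Field names ("content_vector", "content") and the vector are hypothetical.
hybrid_query = {
    "knn": {
        "field": "content_vector",            # dense_vector field with embeddings
        "query_vector": [0.12, -0.07, 0.33],  # would come from an embedding model
        "k": 10,                              # nearest neighbours to return
        "num_candidates": 100,                # per-shard candidate pool
    },
    "query": {
        "match": {"content": "quarterly revenue growth"}  # keyword clause
    },
    "size": 10,
}
# This body would be sent to POST /<index>/_search on an Elasticsearch 8.x cluster.
```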

The unstructured data engine works alongside the platform’s other tools, like a federated SQL engine for querying scattered structured data, a processing engine for handling large-scale data transformation, and storage designed for fast, AI-ready access.

Powering enterprise AI discovery

As AI becomes increasingly crucial for business-as-usual operations, Dell PowerEdge R7725 and R770 servers featuring NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs provide the mainstream computing foundation for accelerated enterprise workloads, from visual computing, data analytics and virtual workstations, to physical AI and agentic inference. These servers are ideal for running NVIDIA AI reasoning models such as the latest NVIDIA Nemotron models for agentic AI, as well as NVIDIA Cosmos world foundation models for physical AI.

Offering better price-performance for a wide range of enterprise use cases, these air-cooled systems make flexible, high-density AI compute more attainable. The NVIDIA RTX PRO 6000 offers enterprises up to six times the token throughput for LLM inference,[ii] double the engineering simulation performance[iii] and, with support for Multi-Instance GPU (MIG), up to four times the number of concurrent users compared with the previous generation.

The Dell PowerEdge R7725 server will also be the first 2U server platform to integrate the NVIDIA AI Data Platform reference design. When the Dell PowerEdge R7725 server featuring NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs is paired with the Dell AI Data Platform and its new unstructured data engine, enterprises can take advantage of a turnkey solution without the need to architect and test their own hardware and software platforms. The combination of the two delivers faster inferencing, more responsive semantic search and support for larger, more complex AI workloads.

See innovation in action at SIGGRAPH 2025

Dell Technologies is showcasing how customers can accelerate media production pipelines and power intelligent asset management at scale using the Dell AI Data Platform, NVIDIA Omniverse software and Dell infrastructure at this year’s SIGGRAPH conference (August 10-14) in Vancouver, Canada. Dell will also feature the new Dell Pro Max high-performance PC portfolio, including laptops, desktops and the upcoming Dell Pro Max with GB10, a compact AI developer workstation.

“The key to unlocking AI’s full potential lies in breaking down silos and simplifying access to enterprise data,” said Arthur Lewis, president, Infrastructure Solutions Group, Dell Technologies. “Collaborating with industry leaders like NVIDIA and Elastic to advance the Dell AI Data Platform will help organizations accelerate innovation and scale AI with confidence.”

“Enterprises worldwide need infrastructure that handles the growing scale and complexity of AI workloads,” said Justin Boitano, vice president of enterprise AI at NVIDIA. “With NVIDIA RTX PRO 6000 GPUs in new 2U Dell PowerEdge servers, organizations now have a power efficient, accelerated computing platform to power AI applications and storage on NVIDIA Blackwell.”

“Fast, accurate, and context-aware access to unstructured data is key to scaling enterprise AI,” said Ken Exner, Chief Product Officer at Elastic. “With Elasticsearch vector database at the heart of the Dell AI Data Platform’s unstructured data engine, Elastic will bring vector search and hybrid retrieval to a turnkey architecture, enabling natural language search, real-time inferencing, and intelligent asset discovery across massive datasets. Dell’s deep presence in the enterprise makes them a natural partner as we work to help customers deploy AI that’s performant, precise, and production-ready.”

Availability

  • Unstructured data engine in Dell AI Data Platform will be available later this year.
  • Dell PowerEdge R7725 and R770 servers with NVIDIA RTX PRO 6000 GPUs will be globally available later this year.

Award-Winning Shannon Teen to Represent Ireland in the Technovation Global Semi-Finals with Dementia Support App

Chloe O’Loughlin, a 2nd year student at St. Patrick’s Comprehensive School in Shannon, won the Junior category award at the Technovation Ireland Regional Pitch Event (RPE) at AMD’s headquarters in CityWest, Dublin. Chloe has been working on an app to help people with dementia since she started the Teen-Turn afterschool program in January, and twelve weeks of mentoring and support led to the big event. Sixty-five projects from eight counties and more than 20 schools were submitted to Technovation from Ireland this year. Chloe will now represent Ireland in the semi-finals of the global competition.

This project was particularly personal to Chloe, who had a dream to help people suffering with dementia in a simple and easy way. She said: “I have relatives with dementia. I see firsthand the effects that it has on them. I wanted to think of something that would help them.”

It was this initial thought that led her to create and develop her app, Brain Track. “The app features a to-do list and calendar to keep track of different appointments, and allows relatives and caregivers to keep an eye on them.” An additional feature is a profile page where the user or their relatives can input profiles of the people around them, including photos and key details about each person. Chloe believes this to be a very important feature for the wellbeing of the person with dementia and a reassurance to their loved ones. “If someone with dementia were to forget someone’s name, they can go into the app and remind themselves of it, which would save them the frustration of having to ask the person directly.”

Chloe committed to 12 weeks of staying after school with more than 10 other girls in the group to learn to code, develop a business plan and pitch to judges. Her mentor Donna O’Sullivan, who is also a teacher at the school, applauded Chloe for her commitment and dedication to the project and app. “Chloe has shown tremendous dedication and commitment to attending our Teen-Turn sessions after school each week and made superb progress during that time. We are so proud of what she has achieved and grateful to Teen-Turn for their continued support in developing tech skills for our girls.”

Chloe really enjoyed participating in Teen-Turn’s afterschool program and said: “I really enjoyed working with everyone in the group. We got tons of different ideas and opinions when putting it all together, and it was great to see how the others’ projects were coming along.” Chloe also benefited from extra sessions held by Teen-Turn over the midterm in partnership with PayPal, and on two Super Session Saturdays the girls had the opportunity to gain in-person mentorship at the University of Limerick from academic staff and Immersive Software Engineering students.

On the day of the RPE, Chloe said that she was a little apprehensive to pitch to the judges but found the experience a very powerful one. “I was very nervous, but the judges were really nice and once I started, I was more excited than nervous. They genuinely wanted to find out more about the app and how it works.” Chloe was overjoyed to take first place on the day, winning a laptop with AMD chip technology, and is looking forward to the next stage of the competition.

So what does the future hold for Chloe and Brain Track? “I definitely want to continue with Teen-Turn and with Project Squad and Technovation in the future because I really enjoyed it,” says Chloe, who is very excited at the thought of possibly bringing the app to market in the future. “I think the app will help people and I want to continue improving it.” Chloe hopes she can make the future a little brighter for those with dementia and their loved ones.

Technovation is a tech education nonprofit that inspires people around the world to believe in themselves as leaders and become more confident, curious problem-solvers. Technovation offers interactive learning programs in which young people ages 8-18 and adults in their community learn how to use technology to solve real-world problems. Technovation partners with leading organisations like UNESCO and UN Women, and with mentors from companies like Google, NVIDIA, and Adobe to reach children and families in more than 100 countries. To learn more, visit technovation.org.

Dell Technologies Fuels Enterprise AI Innovation with Infrastructure, Solutions and Services

Dell Technologies (NYSE:DELL), the world’s No. 1 provider of AI infrastructure, announces Dell AI Factory advancements, including powerful and energy-efficient AI infrastructure, integrated partner ecosystem solutions and professional services to drive simpler and faster AI deployments.

Why it matters

AI is now essential for businesses, with 75% of organizations saying AI is key to their strategy and 65% successfully moving AI projects into production. However, challenges like data quality, security concerns and high costs can slow progress.

The Dell AI Factory approach can be up to 62% more cost-effective for inferencing LLMs on-premises than the public cloud and helps organizations securely and easily deploy enterprise AI workloads at any scale. Dell offers the industry’s most comprehensive AI portfolio, designed for deployments across client devices, data centers, edge locations and clouds. More than 3,000 global customers across industries are accelerating their AI initiatives with the Dell AI Factory.


Dell infrastructure advancements help organizations deploy and manage AI at any scale

Dell introduces end-to-end AI infrastructure to support everything from edge inferencing on an AI PC to managing massive enterprise AI workloads in the data center.

Dell Pro Max AI PC delivers industry’s first enterprise-grade discrete NPU in a mobile form factor

The Dell Pro Max Plus laptop with Qualcomm® AI 100 PC Inference Card is the world’s first mobile workstation with an enterprise-grade discrete NPU. It offers fast and secure on-device inferencing at the edge for large AI models typically run in the cloud, such as today’s 109-billion-parameter model.

The Qualcomm AI 100 PC Inference Card features 32 AI cores and 64 GB of memory, providing the power to meet the needs of AI engineers and data scientists deploying large models for edge inferencing.

Dell redefines AI cooling with innovations that reduce cooling energy costs by up to 60%

The industry-first Dell PowerCool Enclosed Rear Door Heat Exchanger (eRDHx) is a Dell-engineered alternative to standard rear door heat exchangers. Designed to capture 100% of IT-generated heat with its self-contained airflow system, the eRDHx can reduce cooling energy costs by up to 60% compared to currently available solutions.


With Dell’s factory integrated IR7000 racks equipped with future-ready eRDHx technology, organizations can:

  • Significantly cut costs and eliminate reliance on expensive chillers, because the eRDHx operates with warmer water temperatures (between 32 and 36 degrees Celsius) than traditional solutions.
  • Maximize data center capacity by deploying up to 16% more racks of dense compute, without increasing power consumption.
  • Enable air cooling capacity up to 80 kW per rack for dense AI and HPC deployments.
  • Minimize risk with advanced leak detection, real-time thermal monitoring, and unified management of all rack-level components with the Dell Integrated Rack Controller.


Dell PowerEdge servers with AMD GPUs maximize performance and efficiency


Dell PowerEdge XE9785 and XE9785L servers will support AMD Instinct™ MI350 series GPUs, which offer 288 GB of HBM3E memory per GPU and deliver up to 35 times greater inferencing performance than the previous generation. Available in liquid-cooled and air-cooled configurations, the servers will reduce facility cooling energy costs.

Dell advancements power efficient and secure AI deployments and workflows

Because AI is only as powerful as the data that fuels it, organizations need a platform designed for performance and scalability. The Dell AI Data Platform updates improve access to high quality structured, semi-structured and unstructured data across the AI lifecycle.


  • Dell Project Lightning is the world’s fastest parallel file system per recent testing, delivering up to two times greater throughput than competing parallel file systems. Project Lightning will accelerate training time for large-scale and complex AI workflows.
  • Dell Data Lakehouse enhancements simplify AI workflows and accelerate use cases — such as recommendation engines, semantic search and customer intent detection — by creating and querying AI-ready datasets.


“We’re excited to work with Dell to support our cutting-edge AI initiatives, and we expect Project Lightning to be a critical storage technology for our AI innovations,” said Dr. Paul Calleja, director, Cambridge Open Zettascale Lab and Research Computing Services, University of Cambridge.

With additional portfolio advancements, organizations can:

  • Lower power consumption, reduce latency and boost cost savings for high performance computing (HPC) and AI fabrics with Dell Linear Pluggable Optics.
  • Increase trust in the security of their AI infrastructure and solutions with Dell AI Security and Resilience Services, which provide full stack protection across AI infrastructure, data, applications and models.
Dell expands AI partner ecosystem with customizable AI solutions and applications

Dell is collaborating with AI ecosystem players to deliver tailored solutions that simply and quickly integrate into organizations’ existing IT environments. Organizations can:

  • Enable intelligent, autonomous workflows with a first-of-its-kind on-premises deployment of Cohere North, which integrates various data sources while ensuring control over operations.
  • Innovate where the data is with Google Gemini and Google Distributed Cloud on-premises, available on Dell PowerEdge XE9680 and XE9780 servers.
  • Prototype and build agent-based enterprise AI applications with Dell AI Solutions with Llama, using Meta’s latest Llama Stack distribution and Llama 4 models.
  • Securely run scalable AI agents and enterprise search on-premises with Glean. Dell and Glean’s collaboration will deliver the first on-premises deployment architecture for Glean’s Work AI platform.
  • Build and deploy secure, customizable AI applications and knowledge management workflows with solutions jointly engineered by Dell and Mistral AI.


The Dell AI Factory also expands to include:

  • Advancements to the Dell AI Platform with AMD add 200G of storage networking and an upgraded AMD ROCm open software stack for organizations to simplify workflows, support LLMs and efficiently manage complex workloads. Dell and AMD are collaborating to provide Day 0 support and performance optimized containers for AI models such as Llama 4.
  • The new Dell AI Platform with Intel helps enterprises deploy a full stack of high performance, scalable AI infrastructure with Intel® Gaudi® 3 AI accelerators.


Dell also announced advancements to the Dell AI Factory with NVIDIA and updates to Dell NativeEdge to support AI deployments and inferencing at the edge.


“It has been a non-stop year of innovating for enterprises, and we’re not slowing down. We have introduced more than 200 updates to the Dell AI Factory since last year,” said Jeff Clarke, chief operating officer, Dell Technologies. “Our latest AI advancements — from groundbreaking AI PCs to cutting-edge data center solutions — are designed to help organizations of every size to seamlessly adopt AI, drive faster insights, improve efficiency and accelerate their results.”


“We leverage the Dell AI Factory for our oceanic research at Oregon State University to revolutionize and address some of the planet’s most critical challenges,” said Christopher M. Sullivan, director of Research and Academic Computing for the College of Earth, Ocean and Atmospheric Sciences, Oregon State University. “Through advanced AI solutions, we’re accelerating insights that empower global decision-makers to tackle climate change, safeguard marine ecosystems and drive meaningful progress for humanity.”


Dell Technologies Unveils Next Generation Enterprise AI Solutions with NVIDIA

The world’s top provider of AI-centric infrastructure, Dell Technologies announces innovations across the Dell AI Factory with NVIDIA – all designed to help enterprises accelerate AI adoption and achieve faster time to value.

Why it matters

As enterprises make AI central to their strategy and progress from experimentation to implementation, their demand for accessible AI skills and technologies grows exponentially.

Dell and NVIDIA continue the rapid pace of innovation with updates to the Dell AI Factory with NVIDIA, including robust AI infrastructure, solutions and services that streamline the path to full-scale implementation.

Dell infrastructure advances enterprise AI innovation with enhanced power, efficiency, and scalability

Dell Technologies introduces the next generation of advanced compute, data storage, data management and networking solutions:

  • Air-cooled Dell PowerEdge XE9780 and XE9785 servers simplify integration into existing enterprise data centers, while liquid-cooled Dell PowerEdge XE9780L and XE9785L servers accelerate rack-scale deployment. The new PowerEdge servers support up to 192 NVIDIA Blackwell Ultra GPUs with direct-to-chip liquid cooling and can be customized with up to 256 NVIDIA Blackwell Ultra GPUs per Dell IR7000 rack. As the successors to Dell’s fastest-ramping solution ever, the Dell PowerEdge XE9680, these platforms can deliver up to four times faster large language model (LLM) training with the 8-way NVIDIA HGX B300.
  • The Dell PowerEdge XE9712 featuring NVIDIA GB300 NVL72 offers rack-scale efficiency for training, up to 50 times more AI reasoning inference output and a 5x improvement in throughput. With new Dell PowerCool technology, this platform helps businesses achieve greater power efficiency.
  • The Dell PowerEdge XE7745 server will be available with NVIDIA RTX Pro™ 6000 Blackwell Server Edition GPUs in July 2025. This platform – supported in the NVIDIA Enterprise AI Factory validated design – provides a universal platform to help meet the needs of physical and agentic AI use cases like robotics, digital twins, and multi-modal AI applications with support for up to 8 GPUs in a 4U chassis.
  • Dell plans to support the NVIDIA Vera CPU, offering speed, efficiency, and performance.
  • Dell plans to support the NVIDIA Vera Rubin platform with a new Dell PowerEdge XE server designed for Dell Integrated Rack Scalable Systems.


Connecting it all, Dell extends its networking portfolio to include the Dell PowerSwitch SN5600 and SN2201 Ethernet switches, part of the NVIDIA Spectrum-X Ethernet networking platform, and NVIDIA Quantum-X800 InfiniBand switches. These high-density, low-latency switches deliver up to 800 gigabits per second of throughput and are now backed by Dell ProSupport and Deployment Services to provide expert guidance at every stage of AI deployment.

Dell AI Factory with NVIDIA solutions support the NVIDIA Enterprise AI Factory validated design, featuring Dell and NVIDIA compute, networking, storage, and NVIDIA AI Enterprise software, providing an end-to-end, fully integrated AI solution for enterprises.

Dell AI Data Platform advancements improve AI data management

Because AI is only as powerful as the data that fuels it, organizations need a platform designed for performance and scalability. Dell AI Data Platform advancements provide AI applications with always-on access to high quality data.

  • Dell ObjectScale supports large-scale AI deployments while helping reduce cost and data center footprint with the introduction of a denser, software-defined system. NVIDIA BlueField and Spectrum networking integrations boost performance and scalability.
  • Dell introduces a high-performance solution built with Dell PowerScale, Dell Project Lightning, and PowerEdge XE servers. Using KV caching and integrating NVIDIA NIXL libraries, this solution is ideal for large-scale distributed inference workloads.
  • Dell ObjectScale will support S3 over RDMA, achieving up to 230% higher throughput, up to 80% lower latency and 98% reduced CPU load compared to traditional S3, for better GPU utilization.
  • Dell announces an integrated solution that incorporates the NVIDIA AI Data Platform to accelerate curated insights from data and speed the development of agentic AI applications and tools.
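For scale, the percentage claims in the list above can be translated into plain multipliers. This is simple arithmetic, not a benchmark, and the absolute baseline figures are not published here:

```python
# Translating "% higher" / "% lower" marketing figures into multipliers
# (illustrative arithmetic only; baselines are not published in the release).

def pct_higher(pct: float) -> float:
    """'pct% higher' than a baseline equals a (1 + pct/100)x multiplier."""
    return 1 + pct / 100

def pct_lower(pct: float) -> float:
    """'pct% lower' than a baseline equals a (1 - pct/100)x multiplier."""
    return 1 - pct / 100

throughput_x = pct_higher(230)  # ~3.3x the throughput of traditional S3
latency_x = pct_lower(80)       # ~0.2x the latency, i.e. ~5x faster
cpu_load_x = pct_lower(98)      # ~0.02x the CPU load per request

print(round(throughput_x, 2), round(latency_x, 2), round(cpu_load_x, 2))
```

So "up to 230% higher throughput" means up to roughly 3.3 times the baseline, not 2.3 times.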


Software updates help organizations seamlessly deploy agentic AI


  • The NVIDIA AI Enterprise software platform, available directly from Dell, gives businesses the option to innovate on the Dell AI Factory with NVIDIA using NVIDIA NIM and NVIDIA NeMo microservices, NVIDIA Blueprints, NVIDIA NeMo Retriever for RAG and NVIDIA Llama Nemotron reasoning models, and to seamlessly develop agentic workflows while accelerating time to value for AI outcomes.
  • Simplify business-critical AI deployments while providing flexibility and security with Red Hat OpenShift available on the Dell AI Factory with NVIDIA.

New managed services streamline operations and drive faster outcomes

The new Dell Managed Services for the Dell AI Factory with NVIDIA simplify AI operations with management of the full NVIDIA AI solutions stack — including AI platforms, infrastructure and NVIDIA AI Enterprise software. Dell managed services experts handle 24×7 monitoring, reporting, version upgrades and patching, helping teams overcome resource and expertise constraints by providing cost-effective, scalable and proactive IT support.

Perspectives

“We’re on a mission to bring AI to millions of customers around the world,” said Michael Dell, chairman and chief executive officer, Dell Technologies. “Our job is to make AI more accessible. With the Dell AI Factory with NVIDIA, enterprises can manage the entire AI lifecycle across use cases, from training to deployment, at any scale.”

“AI factories are the infrastructure of modern industry, generating intelligence to power work across healthcare, finance and manufacturing,” said Jensen Huang, founder and chief executive officer, NVIDIA. “With Dell Technologies, we’re offering the broadest line of Blackwell AI systems to serve AI factories in clouds, enterprises and at the edge.”

HighPoint’s PCIe Gen5 x16 RocketStor 8631CW Breaks New Ground for GPU Expansion

HighPoint’s RocketStor 8631CW represents a revolutionary leap forward in eGPU (external GPU) connectivity. Designed for professional media, industrial applications, and high-demand computing environments, the RocketStor 8631CW leverages industry-proven PCIe Switching technology and robust CopprLink CDFP connectivity to deliver an unprecedented x16 lanes of dedicated host-to-device PCIe 5.0 bandwidth.


Key Features
Burdened by performance and latency bottlenecks imposed by Thunderbolt and OCuLink connectivity? This groundbreaking solution liberates users from the limitations of conventional eGPU solutions, offering unparalleled performance and flexibility.

Unleash the True Potential of Your GPU with Industry-Leading PCIe Switching Technology: The RocketStor 8631CW harnesses the power of cutting-edge PCIe Gen5 switching technology to ensure your GPU operates at its maximum potential. This technology eliminates the performance and latency bottlenecks commonly associated with traditional eGPU enclosures, providing a seamless and efficient connection between your host system and GPU.

Ready for Today’s Fastest, Power-Hungry GPUs: The RocketStor 8631CW is engineered to support the most advanced, power-hungry GPUs available, including full-height, 2-slot, and 3-slot models from NVIDIA, AMD, Intel, and more. It provides a massive 600W of direct power, ensuring that even the most demanding GPUs can operate at peak performance without any power-related constraints.

Robust CopprLink CDFP Connectivity Delivers x16 Lanes of Dedicated Gen5 Host Bandwidth: The innovative CopprLink CDFP connectivity delivers a dedicated x16 lane connection, offering a staggering 128GB/s of bi-directional bandwidth. This ensures that your GPU receives the full bandwidth it needs to handle the most intensive workloads with ease.
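As a quick sanity check (not from HighPoint's materials), the quoted 128 GB/s figure is roughly what PCIe 5.0 per-lane signaling implies when both directions are counted:

```python
# Back-of-the-envelope PCIe 5.0 x16 bandwidth estimate. Assumption: the
# marketing figure sums both directions and ignores protocol overhead
# beyond the line code.
GT_PER_S = 32          # PCIe 5.0 raw signaling rate per lane (GT/s)
ENCODING = 128 / 130   # 128b/130b line-code efficiency
LANES = 16

per_lane_gbs = GT_PER_S * ENCODING / 8  # GB/s per lane, one direction
one_dir = per_lane_gbs * LANES          # ~63 GB/s per direction
bidir = one_dir * 2                     # ~126 GB/s, rounded up to "128 GB/s"

print(f"{one_dir:.1f} GB/s per direction, {bidir:.1f} GB/s bidirectional")
```

The commonly quoted 128 GB/s uses the raw 4 GB/s-per-lane figure; after 128b/130b encoding the usable payload ceiling is closer to 126 GB/s, before transaction-layer overhead.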

Low-Profile LP-MD2 Adapter for Compact Workstations and Mini-PCs: The RocketStor 8631CW includes a low-profile LP-MD2 adapter, making it an ideal solution for compact workstations, rackmount chassis, and mini-PCs. This adapter ensures that even space-constrained systems can take advantage of the RocketStor 8631CW’s powerful capabilities.

Advanced Dual-Fan Cooling System with Programmable Smart Fan Control: To maintain optimal performance, the RocketStor 8631CW features an advanced dual-fan cooling system with programmable smart fan control. This system effectively dissipates heat, preventing thermal throttling and ensuring that your GPU operates at peak performance even under heavy loads.

Simple & Seamless Plug-and-Play Software-less Installation for Linux & Windows Platforms: The RocketStor 8631CW offers a straightforward plug-and-play installation process that requires no additional software. This makes it easy to integrate into both Linux and Windows platforms, allowing for quick and efficient deployment in any professional or industrial environment.

Target Use Cases
AI/ML Computing & Deep Learning: The RocketStor 8631CW is perfect for AI and machine learning applications that require the processing power of high-end GPUs. Its robust connectivity and power delivery ensure that these demanding workloads are handled with ease.

High-Performance Content Creation & Rendering: For content creators and 3D artists, the RocketStor 8631CW provides the performance needed for real-time rendering and high-resolution video editing, making it an invaluable tool for professional media production.

High-End Gaming, VR, and Media Platforms: Gamers and VR enthusiasts can benefit from the RocketStor 8631CW’s ability to support the latest, most powerful GPUs, providing a smooth and immersive gaming experience.

Scientific Research & Big Data Analytics: Researchers and data scientists can leverage the RocketStor 8631CW to accelerate complex computations and data analysis, enabling faster insights and more efficient research processes.

Enterprise Workloads & Cloud Computing: The RocketStor 8631CW is also suited for enterprise environments, where it can enhance the performance of cloud computing platforms and support the demanding workloads of large-scale data centers.

In Summary
HighPoint RocketStor 8631CW is a game-changing eGPU expansion solution that breaks free from the constraints of traditional connectivity options. With its industry-leading PCIe Gen5 switching technology, robust CopprLink CDFP connectivity, and advanced cooling system, it delivers unparalleled performance and reliability for a wide range of applications. Whether you’re a professional media creator, a data scientist, or an enterprise IT manager, the RocketStor 8631CW is designed to meet your most demanding GPU expansion needs.

Learn More
RocketStor 8631CW – PCIe 5.0 x16 GPU Expansion Chassis
https://www.highpoint-tech.com/rs8631cw-individual-page