Dell unveils its first-ever earbuds, the Dell Pro Plus Earbuds

Dell Technologies today announced the launch of the Dell Pro Plus Earbuds (EB525), the latest addition to its AI-based audio portfolio. Designed for the hybrid workforce, these earbuds combine advanced noise filtering, voice optimisation, and all-day comfort for employees, while offering IT teams simplified deployment and centralized manageability.

As the world’s first earbuds to earn Microsoft Teams Open Office Certification, the Dell Pro Plus Earbuds are engineered to empower effortless communication – whether users are commuting, working remotely, or jumping into a quick Teams meeting. Certified for both Microsoft Teams and Zoom, they ensure flawless compatibility with the most widely used professional collaboration platforms.

Consumer-grade audio gear often falls short in professional settings, where background noise and poor mic quality can derail productivity. The Dell Pro Plus Earbuds address these challenges head-on. With an AI-powered noise-cancelling microphone trained on over 500 million noise samples, adaptive active noise cancellation (ANC), and enhanced transparency mode, they deliver a premium audio experience tailored for the modern workplace.

Built for real-world conditions, the earbuds come with four ear tip sizes (XS, S, M, L) for a custom fit. Seamless multi-device connectivity is enabled through Bluetooth 5.3 and multi-host pairing, while the Dell Pair feature ensures quick setup and smooth transitions. A compact charging case with a built-in Dell Wireless USB-C Audio Receiver adds on-the-go convenience.

For IT administrators, the Dell Display and Peripheral Manager (DDPM) provides a centralised dashboard for easy device configuration, firmware updates, and fleet oversight – streamlining audio management across hybrid teams.

“The Dell Pro Plus Earbuds are designed to empower effortless communication, whether you’re commuting, working remotely or jumping into a quick Teams meeting,” said Mike Alessi, Senior Director of Global Product Planning & Launch for Collaboration Peripherals at Dell Technologies. “The future of workplace audio isn’t just about cutting-edge technology, it’s about empowering teams to connect, collaborate and thrive. With the Dell Pro Plus Earbuds, that future is here.”

The Dell Pro Plus Earbuds are available through Dell Technologies and authorised partners in Ireland and across EMEA at a recommended retail price of €245.99 including VAT. Find out more at: www.dell.ie.

 


MyNetDiary Launches PlateAI: The First All-in-One AI-Powered Diet & Wellness App

MyNetDiary, the top-rated nutrition and health platform trusted by over 28 million users, has announced PlateAI, the next-generation app for personalized diet and wellness. Available on iOS and Android, PlateAI combines instant AI-powered food logging, real-time nutrition coaching, and a complete suite of health tools to support better habits and sustainable results.

Built for everyday life, PlateAI helps users eat better, lose weight, and stay consistent without the guesswork. The app combines the proven tracking accuracy of MyNetDiary with a powerful new AI Coach that delivers instant feedback, motivation, and meal guidance tailored to the user’s goals.

“PlateAI is your complete support system,” said Sergey Oreshko, founder and CEO of MyNetDiary. “We’ve taken the trusted foundation of MyNetDiary and layered in real-time coaching, photo-based and voice logging, and a smarter, more connected experience. It’s everything people need to succeed in one app.”

With PlateAI, users can log meals by speaking, snapping a photo, or scanning a menu, with AI instantly estimating nutrition and offering personalized suggestions. Its verified database of over 1.9 million foods includes both macronutrients and 100+ micronutrients, providing unmatched nutritional depth.

Unlike traditional calorie counter apps that rely on tedious manual entry and a do-it-yourself approach, or basic AI apps that offer quick logging but limited features and generic guidance, PlateAI delivers both speed and depth. It combines fast AI-powered logging with a comprehensive wellness platform — and goes further with a real-time AI Coach that provides personalized, actionable support based on each user’s habits, goals, and progress, making it easier to stay consistent and achieve sustainable results.

Key Features Include:

  • AI Coach (24/7) – Personalized, real-time guidance, encouragement, and insight based on actual user data
  • Photo & Voice Logging – Track meals instantly without typing or manual entry
  • Menu Scan & Smart Suggestions – Scan a restaurant menu and get personalized recommendations
  • Daily Nutrition Feedback – Trend analysis and suggestions to help users adjust in real time
  • Custom Meal Plans – Expert-designed programs for weight loss, strength, or balanced eating
  • 1.9M+ Verified Foods – Complete with macros and 100+ micronutrients
  • Intermittent Fasting Tools – Flexible tracking and support
  • Fitness Integration – Works with Fitbit, Garmin, Apple Health, Google Fit, and more
  • Built-in Community Support – Ask questions, share meals, and get encouragement

Built on two decades of trusted nutrition expertise, PlateAI delivers an elevated, AI-driven experience to support better health, improved consistency, and sustainable results all in one place. Learn more at PlateAI.com.

What Role Does Data Play in Building Effective Multimodal AI Systems?

Data drives every layer of effective multimodal AI systems, making it essential for connecting information from text, images, audio, and beyond. These systems thrive on high-quality, well-annotated, and diverse datasets that enable more accurate understanding and integration across multiple data types. With AI-powered computer vision development, visual data can be transformed into actionable insights, broadening the reach and impact of multimodal AI functions.

As businesses look to innovate, the ability of multimodal AI to process varied data inputs is vital for real-world applications. Proper data strategy shapes not only how AI perceives information but also determines the quality and reliability of outputs in practical environments.

Key Takeaways

  • Data quality and diversity are critical for multimodal AI.
  • Cross-modal data integration enables sophisticated analysis.
  • Real-world performance depends on robust data-driven strategies.

The Foundation of Data in Multimodal AI Systems

Data is essential in training multimodal AI, as it allows systems to integrate language, visual, and audible information. By leveraging diverse and high-quality data, these systems can achieve greater accuracy and adaptability in real-world applications such as medical imaging, sentiment analysis, and image captioning.

Types of Data Used in Multimodal AI

Multimodal AI systems utilize a mix of data from different sources. Common data types include images, text, audio, and video. For example, computer vision leverages visual data, while natural language processing handles textual information. Speech recognition and sentiment analysis benefit from both audio and text.

This integration helps machines learn relationships between modalities. In generative AI and deep learning, handling multi-modal data such as audio-visual clips or paired text-image samples is crucial. Large language models often use a combination of structured and unstructured data to enhance their capabilities. Popular multimodal datasets include the Flickr30K and COCO datasets, which offer paired images and captions for robust model training.

Importance of Data Quality and Diversity

Effective multimodal learning depends on both the quality and diversity of the datasets. High-quality data minimizes errors and ambiguities, ensuring that multimodal models perform consistently across tasks like image captioning and medical imaging. Diverse data, including different languages, accents, visual contexts, and environmental noises, supports the model’s resilience and adaptability.

If one data channel is noisy or missing, a multimodal system can rely on another for context. Well-curated, balanced datasets reduce biases and improve reliability in applications such as AI healthcare and generative AI. Multimodal AI systems are also more robust when drawing from varied and representative sources.

Multimodal Datasets and Benchmarking

Benchmarking multimodal AI requires comprehensive datasets that cover multiple types of input. Widely used resources like the COCO dataset and Flickr30K dataset mix paired images and text, supporting advanced tasks in image captioning and visual question answering.

These multimodal datasets serve as standard benchmarks for comparison across different deep learning models. Organized benchmarking allows researchers to systematically evaluate performance across various AI applications, from sentiment analysis to computer vision. Multimodal datasets have been especially valuable for medical imaging tasks and emerging large language models. Regular benchmarking encourages the development of more accurate and generalizable AI systems.

Data-Driven Strategies for Building Effective Multimodal AI Systems

Developing robust multimodal AI systems demands more than just collecting information. Quality, integration methods, learning strategies, and safeguards for privacy and security are fundamental for performance across real-world tasks like recommendation systems, object detection, and diagnosis.

Data Integration and Fusion Techniques

Effective multimodal AI relies on data integration and fusion to combine signals from diverse sources such as text, images, audio, and video. Early fusion merges input data at the raw stage, enabling neural networks like convolutional or recurrent neural networks to learn joint representations. This approach works well for closely related or synchronized data streams.

Late fusion processes each modality separately before merging high-level features, which is key when dealing with weakly correlated or asynchronous data. Stacking and random forests are often used for late fusion in classification tasks. Combining data using these techniques is critical in sectors like healthcare for integrated diagnosis, or in self-driving cars where visual and sensor data must be fused.
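The difference between the two fusion strategies can be illustrated with a minimal sketch. The feature dimensions, random weights, and two-class output below are purely illustrative placeholders, not drawn from any real model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feature vectors for one sample: a text embedding and an image embedding.
text_feat = rng.standard_normal(8)    # e.g. a sentence embedding
image_feat = rng.standard_normal(16)  # e.g. a CNN feature vector

# --- Early fusion: concatenate raw features, then learn one joint model ---
joint_input = np.concatenate([text_feat, image_feat])  # shape (24,)
W_joint = rng.standard_normal((24, 2))                 # joint classifier weights
early_logits = joint_input @ W_joint                   # shape (2,)

# --- Late fusion: score each modality separately, then merge predictions ---
W_text = rng.standard_normal((8, 2))
W_image = rng.standard_normal((16, 2))
text_logits = text_feat @ W_text
image_logits = image_feat @ W_image
late_logits = 0.5 * text_logits + 0.5 * image_logits   # simple averaging merge

print(early_logits.shape, late_logits.shape)
```

In the early-fusion path a single set of weights sees both modalities at once and can learn cross-modal interactions; in the late-fusion path each modality is scored independently, so one stream can be retrained, reweighted, or dropped without touching the other.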

Learning Approaches for Multimodal AI

Multimodal AI systems benefit from flexible machine learning strategies tailored to diverse data. Supervised learning remains central, training neural networks such as convolutional and recurrent models on labeled modalities. However, self-supervised and contrastive learning approaches are growing, utilizing unlabeled data to learn robust latent representations. For example, contrastive loss forces systems to associate related data (like matching image and caption pairs), enhancing cross-modal retrieval and recommendation systems.
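The contrastive objective described above can be sketched with an InfoNCE-style loss over a batch of paired embeddings. The batch size, embedding dimension, and temperature below are illustrative choices, and the "embeddings" are synthetic random vectors standing in for real encoder outputs:

```python
import numpy as np

def info_nce(img, txt, temperature=0.1):
    """Contrastive loss over a batch of paired embeddings.

    img, txt: (batch, dim) arrays where row i of each is a matching pair.
    The loss is low when matching pairs are more similar than mismatched ones.
    """
    # L2-normalise rows so the dot product is cosine similarity
    img = img / np.linalg.norm(img, axis=1, keepdims=True)
    txt = txt / np.linalg.norm(txt, axis=1, keepdims=True)
    logits = img @ txt.T / temperature  # (batch, batch) similarity matrix
    # Softmax cross-entropy with the diagonal (true pairs) as targets
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(1)
txt = rng.standard_normal((4, 32))
aligned_img = txt + 0.01 * rng.standard_normal((4, 32))  # near-identical pairs
random_img = rng.standard_normal((4, 32))                # unrelated pairs
print(info_nce(aligned_img, txt), info_nce(random_img, txt))
```

Well-aligned pairs should yield a much lower loss than random pairings, which is exactly the pressure that teaches an encoder to place matching images and captions close together in a shared space.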

Probabilistic models can be used to handle uncertainty in input processing, especially when modalities might be noisy or incomplete. Diffusion models, another neural approach, help generate synthetic data to supplement limited training sets, improving object detection and action recognition tasks.
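One simple way to make a late-fusion system tolerant of a noisy or absent modality is confidence-weighted averaging, where each modality's prediction is weighted by a reliability estimate and a missing modality drops out entirely. The modality names, probability vectors, and confidence scores below are made-up illustrations:

```python
import numpy as np

def weighted_fusion(predictions, confidences):
    """Combine per-modality class probabilities, weighted by confidence.

    predictions: dict of modality -> probability vector (or None if missing)
    confidences: dict of modality -> scalar reliability estimate in [0, 1]
    A missing modality (None) simply drops out of the weighted average.
    Assumes at least one modality is present with positive confidence.
    """
    first = next(p for p in predictions.values() if p is not None)
    num = np.zeros_like(np.asarray(first, dtype=float))
    total = 0.0
    for name, probs in predictions.items():
        if probs is None:
            continue  # this modality is absent for this sample
        w = confidences[name]
        num = num + w * np.asarray(probs, dtype=float)
        total += w
    return num / total

# The audio channel is missing; the fused estimate falls back on text + image.
fused = weighted_fusion(
    {"text": [0.7, 0.3], "image": [0.6, 0.4], "audio": None},
    {"text": 0.9, "image": 0.6, "audio": 0.0},
)
print(fused)  # a valid probability vector built from the surviving modalities
```

Because each modality contributes in proportion to its estimated reliability, a degraded sensor can be down-weighted at inference time without retraining the whole model.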

Conclusion

Data is essential for building robust multimodal AI systems. It enables the integration of varied input types—such as text, images, and sensor data—which leads to more capable and context-aware models. Effective use of data allows these systems to learn relationships across different modalities. This results in improved accuracy and adaptability in real-world applications.

Well-curated and diverse datasets are key for ensuring performance and reliability. The quality, completeness, and integration of data sources directly impact how well multimodal AI can function in practical scenarios.

 

Dell Technologies collaborates with Crann Centre to harness AI for social good

Dell Technologies has today announced that it has teamed up with the Crann Centre, a Cork-based charity, to develop an AI-powered solution that enhances care for children, adults and families living with neuro-physical disabilities. This collaboration has resulted in the development of a bespoke AI-powered intake application that reduces the administrative burden on Crann staff, streamlines intake processes, and enhances organisational efficiencies and service delivery.

The collaboration began as a local volunteering connection, and it has since evolved into a relationship that harnesses the power of AI to support how care is delivered to families living with neuro-physical disabilities. Dell Technologies’ Global Presales team worked closely with Crann to streamline their intake process, reducing processing time by 33%, enhancing data capture, and improving the overall experience for families.

The solution has transformed how Crann performs its client intake appointments, improving the consistency and quality of data captured. The final step, currently in progress, will be the full integration of the app with Crann’s Salesforce system to ensure a single and reliable data source.

With fewer administrative burdens, the Crann team can now devote more time to delivering personalised support, strengthening relationships, and improving care outcomes. This collaboration is a testament to how AI can be used for societal good, increasing Crann’s capacity to serve more families without requiring additional resources.

Speaking about the collaboration, Des O’Sullivan, Vice President, Dell Technologies Customer Solution Centres, said: “At Dell Technologies, we believe innovation truly matters when it drives meaningful change in people’s lives. Through a shared commitment of making a difference, our team at Dell Technologies collaborated with Crann to develop an AI-powered solution keeping in mind Crann’s deep-rooted commitment to family-centred wraparound care.

“With the AI-powered solution that our team helped to create, Crann has increased capacity, allowing them to serve more families without requiring additional resources. The benefits extend far beyond efficiency; Crann team members now have the time and space to focus on deep, meaningful interactions with clients and their families, strengthening emotional and practical support.

“As we look to the future, we’re proud of what has been achieved. Our Dell Technologies team in Ireland and our broader Global Presales team has been at the heart of this journey, bringing Dell’s AI innovation to life in a way that delivers real and lasting value to our community partners.”

The Dell-built solution is designed with future scalability in mind, offering a framework that can be adapted across sectors such as education and customer service.

Crann, which offers wraparound services focused on improving independence and wellbeing, now has increased capacity to deliver support that spans generations, underpinned by a shared commitment to personalised care and innovation.

Irish Femtech innovator Joii Launches AI-powered Menstrual Monitoring Solution

Joii, a trailblazing Irish femtech startup which has raised €2.4m in funding to date, has today launched its groundbreaking app and innovative range of sustainable period care products, aiming to transform the way menstrual health is understood, monitored, and discussed in Ireland and beyond.

A world-first, Joii’s menstrual pad and mobile app measure period blood volume and clot sizes, offering users unprecedented insights into menstrual health through AI-powered image analysis. Backed by clinical research and real-world validation, Joii offers a data-driven way to monitor menstrual health, finally giving people with periods the tools to see what’s really going on.

While wearables track steps, sleep, and heart rate, Joii is pioneering a long-overdue frontier in female health, providing data-driven insights into the menstrual cycle and female health. By combining specially designed evaluation pads with a mobile app, users can scan used pads to receive a quantified evaluation of their period flow and clotting patterns. These data points are clinically relevant but often overlooked.

“For too long, women have been told to just ‘track their period’ without any real tools to measure what’s actually happening. With Joii, we’re changing that,” said Joii Founder, Justyna Strzeszynska. “We’re helping people see their periods clearly, for the first time.”

Joii is focused on setting a new standard for menstrual health and, by extension, women’s health issues in general. Heavy menstrual bleeding affects 1 in 3 women, yet diagnosis and treatment are frequently delayed due to vague language like “heavy” or “normal”, without any tangible or qualifying details. Joii introduces objective data to the conversation, empowering users with measurable, visual, and shareable insights that can accelerate diagnosis and care.

Joii’s technology is Class I Medical Device–Certified in the UK and protected under multiple patents. The company is now partnering with leading institutions to develop predictive models for menstrual conditions, such as endometriosis and polycystic ovary syndrome (PCOS), transforming menstrual blood into a powerful diagnostic tool.

Supported by clinical research and validated through real-world studies funded by the NIHR (National Institute for Health and Care Research), Joii products show remarkable results. 62% of users reported improved communication with their healthcare professionals, symptom screening efficacy increased by 288%, and menstrual health literacy rose by 134%, demonstrating the app’s positive impact on menstrual health awareness and management.*

This technology is particularly beneficial for individuals experiencing heavy bleeding, those on long diagnostic journeys, such as for endometriosis, which can currently take up to 10 years to diagnose, and those seeking a better understanding of and control over their menstrual health. Joii empowers users to advocate for themselves and also provides healthcare professionals with objective data that can improve diagnosis and treatment.

What the Experts Say:

Dr Fatema Mustansir Dawoodbhoy (NHS), Clinical Advisor to Joii, said, “This app will definitely offer me better insights into how the patient is feeling as I will be able to understand their symptom variation throughout the month.” Meanwhile, Dr Kushal Chummun, Consultant in Obstetrics and Gynaecology in the Rotunda and Connolly Hospital, agrees – “I think the app is really, really good” – underpinning the clinical confidence in Joii’s innovation.

How It Works:

  1. Wear Joii Pads – Designed for optimal visibility and comfort.

  2. Scan with the Joii App – Use the Joii App to analyse the pad.

  3. Get Insights – Instantly receive volume in millilitres, clot detection, and cycle trends.

 

Joii’s AI-enabled app is free to download on iOS and Android devices, and Joii Evaluation Pads (RRP €6.95) are available now from selected health stores and pharmacies nationwide as well as online from www.joiicare.com.

“Wholefoods is proud to support Joii Care and their groundbreaking approach to menstrual health,” said Ronan O’Flynn, Sales Manager at Wholefoods Wholesale Ireland. “By combining smart technology with a clinically tested menstrual pad, Joii empowers users to measure their period volume and better understand their symptoms – a world-first solution that supports more informed self-care and stronger communication between patients and clinicians.”

AWS x F1 Launch AI-Powered Track Design Experience

Formula 1 and Global Partner AWS have announced the launch of a brand-new interactive digital experience, ‘Real-Time Race Track’, which allows fans to create, customise, and share their own F1 track design. As part of the experience, they can also enter a sweepstake for a chance to win a trip to the FORMULA 1 BRITISH GRAND PRIX 2026.

Using AI-powered analysis from Amazon Nova, ‘Real-Time Race Track’ enables fans to design an original, custom race track of any shape and length using their computer’s mouse or by tracing their finger on any touchscreen device.

Following the completion of the circuit, each turn and straight on the track is analysed by Amazon Nova, which produces key on-track metrics including top speed and projected lap time, as well as two viable race strategies, further evaluating the optimal pit timing, tyre recommendations and tactical adjustments for various weather scenarios. The detailed level of data across the experience offers fans an insight into the world of Formula 1 team strategists and creates an authentic strategic dimension to each custom circuit design.

After creating and submitting a track, fans can enter a sweepstake to win a trip to the FORMULA 1 BRITISH GRAND PRIX 2026, with the winner being selected at random, providing the opportunity to see first-hand a variety of strategies from the teams across the race weekend. The draw for the sweepstake will close on 16 July, 2025.

The latest experience builds upon the strategic Partnership between Formula 1 and AWS, which began in 2018. Over the past seven years, the collaboration has consistently delivered innovations that elevate the on-track competition, as well as the off-track experience for fans. With more than a million data points per second coming off the cars, the cornerstone of Formula 1’s partnership with AWS is the ability to extract valuable insights from all the data, and the ‘Real-Time Race Track’ experience applies that knowledge combined with AWS’s advanced analytics and artificial intelligence for the benefit of fans.

For more information on the ‘Real-Time Race Track’ experience, click here.

Jonny Haworth, Director of Commercial Partnerships, Formula 1 said:

“Our ongoing partnership with AWS continues to evolve and transform how fans interact with Formula 1. The ‘Real-Time Race Track’ experience exemplifies how we’re using cloud technology and AI to bring fans closer to the sport than ever before. As we celebrate our 75th anniversary, we’re giving fans an inside look at the complexities and innovation of race strategy, using the same technology that helps to power our sport.”

Kristin Shaff, Global Director of Strategic Partnerships, AWS said:

“When we first began working with Formula 1, they presented us with a unique challenge – how to use telemetry data to further engage fans during live races. That vision has since materialized into 23 data-driven F1 Insights that appear during the broadcast to help fans better understand how teams devise strategies. With today’s launch of the Real-Time Race Track experience, we’re taking this approach to a new level of interactivity. Now anyone can design their own circuit and instantly see how weather conditions, track configurations, pit timing, and tire selection influence performance.”

 

Advise AI investment surpasses €8M as it launches GenAI agent for brands

Advise, the automated data, analytics and AI platform provider helping retail manufacturers and brand-owners to unlock revenues and margin growth, today launches a new GenAI-powered agent. The expert GenAI assistant has been trained by Advise engineers to enhance customer decision-making, giving them competitive advantage. It represents an €8M investment by the Dublin-based company in the development of its AI platform.

Advise’s GenAI agent is a major update to the company’s AI-powered SaaS platform and can help brands to reach actionable insights 10X faster than previous methods. The platform collates and harmonises sales, inventory and customer data from multiple sources into a single platform. The Advise analytics engine filters out data noise, identifies key patterns and presents the most critical information in real-time. The customised GenAI agent further enhances this process by translating complex analytics into clear, natural language explanations and recommendations, turning statistical outputs into human-digestible insights.

The latest release was created by Advise’s Dublin-based engineering team, using a customised and trained large language model (LLM). While more widely available LLMs are typically limited to chatbot-style interactions, Advise’s agent moves beyond this. The GenAI agent has been trained to augment the role of the category manager – the expert responsible for understanding product performance and optimising new product, pricing and promotion strategies. In doing so, the platform empowers consumer-packaged goods (CPG) brands such as Kerry, Dr. Oetker, Pilgrim Food Masters, Tayto and Britvic, to take decisive action that will result in revenue growth.

Advise has programmed the customised GenAI agent to work from a structured set of instructions and best practices. With the ability to personalise its output according to each customer, it can detect market shifts, anticipate trends and deliver guided and relevant insights tailored to business needs. The agent also ranks, through a News Feed on the Advise platform, its insights based on relevance and urgency, ensuring users receive tailored, high-value recommendations on their next steps.

Kevin McCarthy, CEO, Advise, said: “Many businesses are still focusing on data collection and analytics, but we have moved beyond that. We have automated data processing and evolved GenAI to augment and act as a valuable assistant to category managers; turning data into insights and decisive actions that drive margin growth and competitive advantage.

“At Advise, we believe that the future of category management isn’t just about gathering more data; it’s about making smarter decisions, faster. Our new GenAI agent doesn’t just analyse; it anticipates. It learns, adapts, and prioritises what matters most for each brand, providing tailored recommendations that translates directly into growth. It enables users to move beyond spreadsheets and dashboards to proactive decision-making.”

“LLMs are hugely powerful and are impacting every industry, but their output is non-deterministic and they suffer from creative hallucinations. This can make them unsuitable for numeric and statistical analysis unless carefully managed. What we have created is the next generation of CPG decision-making: where you don’t just gain trustworthy information, but a strategic advantage using GenAI. In a world where speed and precision define success, this is how the best brands will stay ahead.”

London Bus Network to benefit from AI-powered Optimisations through CitySwift and London Bus Operators Partnership

London has one of the largest public transport networks in the world, with nearly 9,000 buses serving approximately 1.8 billion passenger journeys annually. It is operated by seven bus operators on 675 routes franchised by Transport for London (TfL). CitySwift has the potential, over time, to enable all of London’s bus operators to use data, AI optimisations and simulations to mitigate disruptions, optimise resource use and facilitate collaboration. Increasing the performance of the network will deliver more reliable bus services, driving increased passenger volumes and a better passenger experience.

“Our work with both authorities and operators in multiple regions since 2016 has consistently delivered tangible results in improved reliability, efficiency, and passenger and driver satisfaction. We are thrilled to bring these advantages to all operators, drivers and passengers in the London bus network, contributing to TfL’s vision of providing the most efficient and reliable transport network,” said CEO and Co-founder Brian O’Rourke on today’s announcement.

CitySwift currently employs 17 people in the UK and 50 people in Ireland, with ambitions to double headcount in both countries over the next 3 years. Founded in 2016 by Brian O’Rourke and Alan Farrelly, CitySwift empowers private operators and public transport authorities to achieve unmatched efficiency by using data to solve problems. CitySwift’s AI-powered performance optimisation platform delivers insights, simulations, and actionable recommendations to support the provision of high-performing bus services.

This news follows a year of rapid growth for CitySwift in 2024. During that time, the company secured €7 million in funding and announced renewed and new partnerships with Transport for Wales, National Express, Transport for Greater Manchester, Go-Ahead Group, and more.

Google-led initiative to support Irish businesses returns

Google is today calling for entries to the third annual You’re the Business Competition. Delivered in partnership with Enterprise Ireland (EI) and the Local Enterprise Offices (LEOs), You’re the Business is an online platform offering digital training and tools to businesses in Ireland free of charge. The competition element will reward businesses that have demonstrated a commitment to digital at different stages of their journey, with those selected set to receive a Google digital support package, one-to-one mentoring and the chance to win a bespoke, AI-powered advertising campaign.

The competition is open to businesses across Ireland. Entrants must submit a written entry no longer than 200 words telling their story around how they’ve utilised or enhanced digital skills in order to help their business grow or succeed online.

The judging panel, comprising representatives from Google, EI, and the LEOs, along with You’re the Business alumna and past winner Sarah Timony of ADAPTAFASHION, will assess the entries based on their response to the prompt ‘Tell us why you’re the business’. A PDF must be uploaded to the competition submission box on the You’re the Business website by 21 March 2025.

This year’s initiative takes on a new AI-focus. The You’re the Business online platform has access to new training modules where users can explore AI tools and learn how AI can help them prepare for the future. Selected businesses will receive a digital support package from Google which includes a one-year Google Workspace Business Standard subscription with access to AI-powered tools like Gemini Advanced and NotebookLM and a one-to-one consultation from Google experts on how to get the best out of Google Workspace. Winners will also receive a You’re the Business trophy and digital assets and will be invited to attend a dedicated Winners event in Google’s EMEA HQ in Dublin.

New this year, the company or founder that has demonstrated the most creative or effective use of digital skills to grow or enhance their business will receive a one-of-a-kind prize; a bespoke AI-powered advertising campaign, created by Google AI with a team of experts in partnership with the winner.

Cera Ward, Managing Director at Google Ireland, said:

“We are so excited to launch the third-annual You’re the Business competition and look forward to learning more about these companies, their challenges, and digital ambitions. We know that AI has the power to revolutionise the way we do business which is why we have made it a key focus of this year’s initiative. Google wants to support these companies as they take the next steps on their AI journey.”

 

Carol Gibbons, Divisional Manager, Entrepreneurship, Regions & Local Enterprise at Enterprise Ireland, said:

“Enterprise Ireland and the Local Enterprise Offices are committed to empowering Irish businesses to succeed online. Our collaboration with Google on initiatives like this competition is a key part of that commitment. We are thrilled to partner with Google on this exciting opportunity and look forward to discovering the innovative ways businesses are approaching their digital growth. We’re here to support them every step of the way on their digital journey.”

John Magee, Chair of the Local Enterprise Office Network said:

“We are committed to helping Irish businesses thrive in a digital world, and this initiative, delivered in partnership with Google, is just one of the many ways we continue to support our clients on their digital journey.”

Speaking at the launch announcement, past winner and judge for the 2025 competition Sarah Timony, CEO and Founder, ADAPTAFASHION, said:

“Participating in the You’re the Business competition was a fantastic experience. The programme provided me with invaluable support and resources that directly contributed to the growth and success of ADAPTAFASHION. I’m so grateful for the opportunity and would encourage fellow founders to participate in this year’s initiative to see how far digital can bring you on your growth journey.”

Entries are open now with further information on the You’re the Business competition available at: g.co/yourethebusiness