PixelCable Pro 240W App-Controlled Magnetic Charging Cable with Smart Display

Bored of your standard charging cable? Fear not: mobfree are back with a new iteration of their PixelCable, this time called PixelCable Pro, and it is faster and better.

Offering a bigger display, faster charging and quicker data transfer, this one is a must-have for anyone who picked up the first version we reviewed.

PixelCable Pro features the latest USB Power Delivery 3.1 technology, boasting a nominal maximum power transfer capability of 240W. Equipped with an E-mark smart chip, PixelCable Pro guarantees safe and rapid charging for a diverse array of devices, from wireless earbuds and smartphones to power-hungry laptops. It can charge a MacBook Pro 16″ to 56% in just 30 minutes, effortlessly unlocking the full potential of all your USB-C devices.

This advanced chip facilitates optimal power flow, effectively adjusting the current to match your device’s specific requirements. So not only does your device charge swiftly, but it also stays protected against potential damage from overcharging or overheating.

The OLED display gives at-a-glance info on your phone’s actual battery level without having to wake up the screen. Keep tabs on power with the clear display and get on with your day.

QCharge App

PixelCable Pro enables fast data transmission of up to 480Mbps, meaning a 1GB file can be transferred in around 24 seconds. Short or long, you can coil PixelCable Pro to any size you want. Simply extend it to its full length (5ft cable) or coil it to save space: the choice is yours. The built-in magnetic coil technology allows the cable to almost wrap itself into a neat and tidy shape that stays wound without the need for velcro straps or other restraints.
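As a quick sanity check on that quoted transfer figure (assuming 1GB here means 1,000MB):

```sh
# 480Mbps / 8 = 60MB/s in theory, so 1,000MB would take about 17 seconds;
# the quoted 24 seconds implies ~42MB/s, typical of real-world USB 2.0
# throughput once protocol overhead is accounted for.
echo "scale=1; 1000 / (480/8)" | bc   # theoretical best case, in seconds
```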

Every time you connect your device, PixelCable Pro greets you with a start-up animation. You can choose the animation from the preset images, and its speed and direction can also be changed. You can even upload your own images to create a customised start-up animation.

This magnetic charging cable is suitable for all occasions, whether at home, in the office or in the car, unlocking more ways to use it. The possibilities are endless.


HP Wolf Security Uncovers Evidence of Attackers Using AI to Generate Malware

HP has issued its latest Threat Insights Report revealing how attackers are using generative AI to help write malicious code. HP’s threat research team found a large and refined ChromeLoader campaign spread through malvertising that leads to professional-looking rogue PDF tools, and identified cybercriminals embedding malicious code in SVG images.

The report provides an analysis of real-world cyberattacks, helping organisations to keep up with the latest techniques cybercriminals are using to evade detection and breach PCs in the fast-changing cybercrime landscape.  Based on data from millions of endpoints running HP Wolf Security, notable campaigns identified by HP threat researchers include:

  • Generative AI assisting malware development in the wild: Cybercriminals are already using GenAI to create convincing phishing lures, but to date there has been limited evidence of threat actors using GenAI tools to write code. The team identified a campaign using VBScript and JavaScript believed to have been written with the help of GenAI. The structure of the scripts, comments explaining each line of code, and the choice of native language function names and variables are strong indications that the threat actor used GenAI to create the malware. The attack infects users with the freely available AsyncRAT malware, an easy-to-obtain infostealer which can record victims’ screens and keystrokes. The activity shows how GenAI is lowering the bar for cybercriminals to infect endpoints.
  • Slick malvertising campaigns leading to rogue-but-functional PDF tools: ChromeLoader campaigns are becoming bigger and increasingly polished, relying on malvertising around popular search keywords to direct victims to well-designed websites offering functional tools like PDF readers and converters. These working applications hide malicious code in an MSI file, while valid code-signing certificates bypass Windows security policies and user warnings, increasing the chance of infection. Installing these fake applications allows attackers to take over the victim’s browsers and redirect searches to attacker-controlled sites.
  • This logo is a no-go – hiding malware in Scalable Vector Graphics (SVG) images: Some cybercriminals are bucking the trend by shifting from HTML files to vector images for smuggling malware. Vector images, widely used in graphic design, commonly use the XML-based SVG format. As SVGs open automatically in browsers, any embedded JavaScript code is executed as the image is viewed. While victims think they’re viewing an image, they are interacting with a complex file format that leads to multiple types of infostealer malware being installed (a minimal illustration follows this list).
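To see why the format is risky, here is a minimal, benign sketch: SVG is just XML, and a browser opening the file directly will run any embedded script (the alert stands in for what would be malicious code):

```sh
# Create a tiny SVG that executes JavaScript when opened in a browser.
cat > demo.svg <<'EOF'
<svg xmlns="http://www.w3.org/2000/svg">
  <script>alert("this 'image' just ran JavaScript");</script>
</svg>
EOF
```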

Val Gabriel, Managing Director of HP Ireland, comments: 

“There has long been speculation about AI being used by attackers, but evidence has been scarce, so this finding is significant. Typically, attackers tend to obscure their intentions to avoid revealing their methods, so this behaviour indicates an AI assistant was used to help write their code. Cases like this show that threat actors are constantly updating their methods, and instances like this one further lower the barrier to entry, allowing novices without coding skills to write scripts, develop infection chains, and launch more damaging attacks. So, businesses must build resilience, closing off as many common attack routes as possible, and adopt a defence-in-depth strategy to mitigate any risks.”

By isolating threats that have evaded detection tools on PCs – but still allowing malware to detonate safely – HP Wolf Security has specific insight into the latest techniques used by cybercriminals. To date, HP Wolf Security customers have clicked on over 40 billion email attachments, web pages, and downloaded files with no reported breaches.

The report, which examines data from calendar Q2 2024, details how cybercriminals continue to diversify attack methods to bypass security policies and detection tools, such as:

  • At least 12% of email threats identified by HP Sure Click bypassed one or more email gateway scanners, the same as the previous quarter.
  • The top threat vectors were email attachments (61%), downloads from browsers (18%), and other infection vectors, such as removable storage (like USB thumb drives) and file shares (21%).
  • Archives were the most popular malware delivery type (39%), 26% of which were ZIP files.

HP Wolf Security runs risky tasks in isolated, hardware-enforced virtual machines running on the endpoint to protect users, without impacting their productivity. It also captures detailed traces of attempted infections. HP’s application isolation technology mitigates threats that can slip past other security tools and provides unique insights into intrusion techniques and threat actor behaviour.

Getting to Grips with Mobile Gametech

Wow, has mobile gaming changed. Over the decades, it’s transformed from a peripheral entertainment niche into an enormously significant and lucrative sector within both the mobile app and gaming ecosystems. For several years now, mobile games have generated over half of all global gaming revenues, outperforming both the incumbent console and PC markets – impressive in our eyes, at least!

According to a recent report by Statista, the mobile gaming industry is forecast to hit a value of over $118 billion in 2027 – truly underscoring its dominant status in gaming. 

And the success of mobile gaming is no mere trend! It’s the result of technology utterly transforming the gaming landscape. Exciting advancements in tons of different areas – mobile hardware, software development, and operational tech – have rewritten our sense of what’s possible. Nowadays, you can use your phone to enjoy the kind of experience you’d once only have expected on high-end consoles and PCs. Plus, have you seen the sheer scale of games available for mobile these days? Cards, bingo, slots, poker, adventures, horror, and more: users can access everything from casino games to esports tournaments on their smartphones. Want to try your hand at slots or poker without doing more than picking up your phone and opening an app? Today’s mobile casino games deliver high-end graphics and gorgeous functionality, and it’s one very clear marker of just how far we’ve come.

And sure, there are plenty of other genres that your mobile can handle, but there’s no question that digital casino games are especially well-suited to this format – as their popularity today attests.

The Mobile Gametech Ecosystem

To fully grasp how mobile gaming thrives, it’s crucial to dissect the technologies that sustain it. The mobile gametech ecosystem can be broadly categorised into three domains: Development, Consumer Tech, and Operations. 

Development

Development is at the heart of any gaming experience, and for mobile developers, the landscape is more competitive than ever. From casual commute-friendly games to AAA multiplayer titles, there’s an ever-increasing demand for new mobile experiences, which makes development tech that can handle the demand essential. 

Consumer Tech

Of course, all the development technologies in the world wouldn’t make a difference if it weren’t for devices. What else would we play the games on? The latest smartphones now come equipped with faster processors, stronger GPUs, and RAM aplenty, making it possible for developers to create games that were once only feasible on games consoles. 

Connectivity also takes a pivotal position in the mobile gametech ecosystem. As controversial as it remains, 5G has been a boon for mobile gamers, offering low latency and lightning-fast download speeds. 

Operations: Monetization, Data, and Maintenance

The operational side of mobile gaming is also crucial to the ecosystem. This includes everything from backend infrastructure to player engagement techniques.

A Deep Dive into Mobile Game Engines

App developers today are increasingly turning to game engines. It will surprise absolutely no one that using the right software to develop a gaming app is key, but did you know it can make the difference between a mediocre title cluttering up the app stores and a standout hit everyone wants to try? Developers need engines that aren’t just robust, but also flexible enough to handle the unique challenges of the mobile platform – which are not insignificant!

What is a Mobile Game Engine?

A mobile game engine is essentially a software framework designed to simplify game development. These frameworks provide the necessary tools to create, manage, and optimise all the elements of a game – not just graphics, but also sound and networking. When creating mobile gaming apps, a game engine can be used to render graphics and manage the game’s underlying logic, which ensures a seamless experience across all mobile devices. 

The Best Mobile Game Engines

There are several game engines in existence and, while none are exclusively designed solely for mobile development, a number do stand out for their adaptable feature sets, which make them ideal for the sector. The likes of Unity and Unreal are the market leaders, but don’t stop with them. There are also a couple of up-and-coming tools that deserve your consideration!

Unity

Unity is perhaps the most popular engine for desktop and console games, but that’s not all! It also ranks highly in the mobile world. If you’re creating 2D or 3D mobile games, it’s excellent, and it’s also versatile and super easy to use. Plus, its cross-platform capabilities mean developers can build for Android, iOS, Windows, and more at the same time – definitely a win.

Unreal Engine

Unreal Engine, developed by Epic Games, counts as another major player here. Sure, it’s better known for AAA console and PC games, but does that mean it’s useless for mobile games? No! Its powerful visual scripting tool, Blueprints, lets developers build games without needing to write a single line of code, opening up options for folks everywhere.

Godot

Free and open-source, Godot is swiftly gaining popularity. Like the engines above, it supports 2D and 3D game development, but it’s much more lightweight than Unreal, which makes it ideal for creating mobile games that are compatible with less powerful devices. 

Cocos2d-x

Popular among developers creating 2D and indie games, Cocos2d-x is a lightweight, open-source engine. It’s widely used in Asian markets, particularly for hypercasual and puzzle games. However, thanks to its support for Android, iOS and Windows, it’s becoming more popular with developers in the UK and Europe.

 

Efficient Data Retrieval: Optimizing API Requests for Developers

In today’s software development landscape, integration is everywhere, and applications constantly need to pull information from external sources. Application Programming Interfaces (APIs) let different software systems communicate, making it easy to fetch information when needed. However, developers also have to remember that API requests should be executed efficiently to maintain system performance. This article explains how APIs make accessing and collecting information much easier, and looks at the cURL GET request and how it can be used for data retrieval.

What is an API?

An API allows two programs to communicate and defines the logic that operates between them. It also acts as a blueprint for software applications, stipulating the rules for how one piece of software may use another.

Significance of Maximizing Data Extraction from APIs

Optimizing data retrieval from APIs is critical for system performance: it minimizes waiting time and puts resources to their most efficient use. Inefficient API usage, by contrast, increases both the retrieval time and the total volume of data transferred, leaving developers with unforeseen data costs.

Best Practices for Making and Managing API Requests

It is essential to learn and apply as many best practices as possible when optimizing API requests.

Grasp the API Endpoints and Parameters

Before making API requests, developers should do due diligence on the provided endpoints and parameters. A solid grasp of how the API is structured and how data is returned makes calls more reliable, as in the sketch below.
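For instance, a well-scoped request asks only for the data it needs. A minimal sketch, assuming a hypothetical endpoint that supports `fields` and `active` query parameters:

```sh
# Request only the fields and records needed, using the API's documented
# query parameters (the endpoint and parameter names here are hypothetical).
curl -s "https://api.example.com/v1/users?fields=id,name&active=true"
```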

Apply Relevant HTTP Methods

For prompt and effective data extraction, it is important to select the HTTP method (GET, POST, PUT, DELETE) that matches the intended operation. GET requests are the right tool for pulling data out, while POST and PUT are used to create or change data, respectively, as the examples below illustrate.
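A minimal sketch of the distinction, against a hypothetical orders endpoint:

```sh
# GET retrieves an existing resource without modifying it.
curl -s "https://api.example.com/v1/orders/42"

# POST creates a new resource from the JSON body.
curl -s -X POST "https://api.example.com/v1/orders" \
     -H "Content-Type: application/json" \
     -d '{"item": "widget", "qty": 3}'
```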

Make Use of Pagination for Big Data Acquisition

When retrieving bigger data sets, pagination lets the developer fetch the data in manageable chunks. This ensures that the system is not stressed and that data is processed more effectively; a sketch follows.
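A minimal pagination loop, assuming a hypothetical API that accepts `page` and `per_page` parameters and returns a JSON array (empty once the data runs out); `jq` is used to inspect each response:

```sh
# Fetch one page at a time, appending records until an empty page comes back.
page=1
while true; do
  body=$(curl -s "https://api.example.com/v1/items?page=${page}&per_page=100")
  [ "$(echo "$body" | jq 'length')" -eq 0 ] && break
  echo "$body" | jq -c '.[]' >> items.jsonl
  page=$((page + 1))
done
```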

Make Use of Suitable Authentication

Authentication helps combat data threats and prevent data tampering. Developers should therefore use robust authentication methods, from simple API keys up to OAuth; both styles are sketched below.
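Two common patterns, with placeholder credentials read from environment variables:

```sh
# API-key style: the key travels in a custom header (header name varies by provider).
curl -s -H "X-Api-Key: $API_KEY" "https://api.example.com/v1/reports"

# OAuth style: a previously obtained access token is sent as a bearer token.
curl -s -H "Authorization: Bearer $ACCESS_TOKEN" "https://api.example.com/v1/reports"
```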

Reasonable Error Handling and Retries

Error handling and retries are core components of issuing API requests. By handling errors soundly and retrying failed requests, there is a much better chance of obtaining the data in the face of intermittent errors, as the sketch below shows.
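cURL has retry support built in; `--retry` waits one second before the first retry and doubles the delay on each subsequent one:

```sh
# Fail on HTTP errors (-f) and retry transient failures up to 5 times.
# (--retry-all-errors requires curl 7.71 or newer.)
curl -sf --retry 5 --retry-all-errors "https://api.example.com/v1/items" \
  -o items.json || echo "request failed after retries" >&2
```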

Things to Note in Dealing with Large Datasets

When handling large amounts of data, certain practices will ensure that retrieval is done optimally and efficiently.

Enable Data Streaming for Large Transfers

By using data streaming, developers can receive and process data at the same time, minimizing the amount of memory used; see the sketch below.
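A minimal sketch, assuming a hypothetical endpoint that emits newline-delimited JSON; each record is handled as it arrives instead of buffering the whole response:

```sh
# -N disables curl's output buffering so records flow through immediately.
curl -sN "https://api.example.com/v1/export.ndjson" | while read -r record; do
  echo "$record" | jq -r '.id'   # process each record here
done
```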

Use Data Compression Approaches

Compressing transfers reduces the size of the data sent over the wire, optimizing bandwidth use and speeding up retrieval.
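With cURL this is a one-flag change; `--compressed` asks the server for a gzip-style encoding and decompresses the response automatically:

```sh
# The server sends a compressed body if it supports it; curl inflates it.
curl -s --compressed "https://api.example.com/v1/large-report"
```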

Managing Rate Limit Policy

API rate-limit policies must be respected to avoid violating a third-party API’s terms of use and being cut off from further data.

Keep an Eye on the API Rate Limit and Respect It

Developers need to monitor and observe a provider’s API-usage policies to preempt any halting of services or blocking of access. Adhering to rate limits keeps you in good standing with data providers and keeps the data flowing; one way to check is sketched below.
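Many providers report remaining quota in response headers. A sketch that inspects them (the exact header names are provider-specific):

```sh
# -D - dumps response headers to stdout; -o /dev/null discards the body.
curl -s -D - -o /dev/null "https://api.example.com/v1/items" \
  | grep -i '^x-ratelimit'
```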

Make Use of Exponential Backoff

Exponential backoff is a programming practice that waits progressively longer between attempts after repeated failures or rate-limit responses, preventing servers from being flooded or throttled. Combined with a cap on the number of attempts, it manages request retries efficiently and effectively, as in the sketch below.
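A minimal hand-rolled version (cURL’s own `--retry`, shown earlier, applies the same doubling strategy automatically):

```sh
# Double the wait after each failed attempt (1s, 2s, 4s, 8s); give up after 5 tries.
delay=1
for attempt in 1 2 3 4 5; do
  if curl -sf "https://api.example.com/v1/items" -o items.json; then
    break
  fi
  echo "attempt ${attempt} failed; retrying in ${delay}s" >&2
  sleep "$delay"
  delay=$((delay * 2))
done
```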

Investigate the Frequency of Requests and Tweak for Each API

Developers can also study how often requests are made and identify those that are unnecessary, so that data can be fetched more intelligently rather than through excessive API calls. By tuning request frequency to each API, developers make systems more effective and data access more efficient.

Using the cURL GET Request

cURL is an efficient command-line application designed to transfer data to and from URLs over a multitude of protocols, which makes it handy for executing API calls straight from a terminal; a basic example follows.
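A minimal GET request against a hypothetical endpoint; `-s` silences progress output and the `Accept` header asks for JSON:

```sh
# The default method is GET, so no -X flag is needed.
curl -s -H "Accept: application/json" "https://api.example.com/v1/users/42"
```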

Benefits of cURL for Fast Data Access

cURL handles all the necessary send and receive operations itself, so no additional client code is needed to communicate with an API. This is very convenient for developers seeking to improve the efficiency of processes that require data access, and its built-in timing output makes quick performance checks easy (see below).
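For example, `-w` can report transfer metrics, which is useful when measuring and tuning data-access speed:

```sh
# Print the total time the request took; the body itself is discarded.
curl -s -o /dev/null -w 'total: %{time_total}s\n' "https://api.example.com/v1/users"
```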

In Conclusion

Developers looking to maximize how a system operates and improve user satisfaction must focus on how data is retrieved. Implementing the strategies above – learning an API’s endpoints, handling large volumes of data sensibly, respecting rate limits, and using tools like cURL – makes the process better. Apply these optimization strategies to improve your data access methods and enhance performance across your software development lifecycle.

 

Keeping Patient Data Safe: Why Cybersecurity Is Important in Medicine

Like most areas of our society, health care has wholeheartedly embraced the boom of digital technology. Computerised equipment and ‘smart’ medical devices have revolutionised patient care, and looking back on the last twenty years, the sorts of advancements that have come about are nothing short of outstanding. 

Of course, it’s not perfect. As is the case with any infrastructure that relies heavily upon technology, there’s always the concern of cyber security. In this article, you’ll learn about the main considerations medical institutions need to make. 

On Data Breaches

Given the vast amounts of personal, sensitive data that hospitals and medical centres deal with on a daily basis, they’ve become a prime target for cybercriminals.

Whether it’s patient medical histories, financial records, insurance details or bank information, hackers frequently target hospitals because of the immense value this sort of data holds on the black market for use in fraud and ransom schemes.

Thankfully, hospitals have now started to employ rigorous encryption methods to ensure patients are protected.

The Risk Involved With Medical Devices

While there wasn’t much concern even ten years ago, the leap in technological advancements seen in medical devices has become a hot topic where cybersecurity is concerned. 

More and more frequently, implantable devices and screening equipment are connected to the internet as standard; this can offer very valuable insight for researchers, but it comes at the added cost of potentially compromising cyber security. 

Aside from the obvious worrisome issue of personal data being leaked, there’s the much more serious implication of hackers being able to interfere with the actual mechanisms of these devices – a very dangerous precedent for patient safety. 

Thankfully, companies like Blue Goat Cyber exist: they work to secure medical devices from a cybersecurity perspective before they even hit the market.  

Training and Awareness in Cybersecurity

When we’re talking cybersecurity, it’s mostly all about letting the latest technology do the work. That doesn’t mean to say that human intervention isn’t crucial, however. 

Over the last several years, hospitals and medical centres have placed a huge focus on training their staff to safely handle sensitive and private data. This sort of training covers cyber hygiene (how to keep data organised and properly dispose of information that’s no longer needed), how to distinguish phishing from regular email, and what steps to take for damage control in the unfortunate event that an attack does happen.

Protecting against cyber attacks in a medical setting requires tight collaboration, as it can take only one weak link for everything to fall down like a house of cards. Software and hardware, if properly maintained, are usually rock-solid, so human error represents a key area for risk mitigation.

Wrapping Up

While data breaches and cyberattacks in hospitals may be a scary prospect, with rigorous testing, thorough staff training, and the use of the latest cybersecurity software and hardware, the risks can be managed well enough that there isn’t major cause for concern. Hopefully, you now have a better idea of how this standard can be accomplished.

Clinical Trials in Rare Diseases: Overcoming the Barriers to Recruitment and Data Collection

Conducting clinical trials for rare diseases presents unique challenges that differ significantly from those for more common conditions. With limited patient populations, geographical dispersion, and unique clinical presentations, the path to gathering meaningful data and securing enough participants can be difficult. However, advancements in digital technology, innovative recruitment methods, and collaborations with patient advocacy groups are helping to overcome these barriers. For patients with rare diseases, these innovations represent hope for new treatments and therapies, often where none previously existed.

Barriers to Recruitment in Rare Disease Trials

One of the most significant barriers to conducting clinical trials for rare diseases is the small patient population. By definition, rare diseases affect fewer than 200,000 people in the United States, and many affect even fewer individuals. This limited pool makes it difficult to recruit enough participants to conduct statistically meaningful studies.

Geographical barriers also complicate recruitment efforts. Patients with rare diseases may be scattered across large regions or even different countries, making it challenging to bring participants to a central research site. Traveling long distances to participate in trials can be burdensome, especially for those who are already dealing with complex, debilitating conditions. Moreover, many rare disease patients may not be aware of the existence of clinical trials due to the limited public awareness and resources surrounding these conditions.

Additionally, the diversity of symptoms and disease progression patterns in rare diseases can make it harder to design standardized protocols that fit every patient’s experience. Researchers often need to adapt trial designs to accommodate these variabilities, but doing so can add complexity and time to the process. As a result, finding the right balance between inclusivity and specificity in participant criteria becomes a critical challenge. Utilizing clinical trial recruitment services can help address these challenges by connecting researchers with eligible patients more efficiently. These services also play a key role in minimizing geographic and logistical barriers to participation.

Innovations in Patient Recruitment

To address these challenges, technology-driven solutions are emerging to help improve patient recruitment in clinical trials. One such solution is Evidation, a digital health platform that uses real-world data to identify and engage potential trial participants. Evidation leverages data to find patients who may qualify for specific trials, allowing researchers to more effectively target recruitment efforts.

By using real-time health data and personalized insights, platforms help streamline the recruitment process, especially in rare disease trials where patients are often geographically dispersed and difficult to identify. This approach reduces the reliance on traditional recruitment methods, such as clinic-based outreach, which may not reach the full range of eligible participants. In turn, it increases the likelihood of recruiting a diverse, engaged, and representative patient pool.

Additionally, digital platforms help minimize the burden on patients by allowing them to participate in decentralized trials. Instead of requiring patients to visit a central research site, remote monitoring tools enable them to participate from their homes. This not only expands the geographic reach of trials but also makes participation more feasible for patients who might otherwise be unable to join due to travel or health limitations.

Addressing Data Collection Challenges

Data collection in rare disease trials can also be challenging due to the variability in disease symptoms and progression. However, innovations in wearable devices and mobile health apps are helping to collect real-time, continuous data, providing a more comprehensive picture of how treatments impact patients over time. These tools capture valuable information that might be missed during intermittent clinic visits, allowing researchers to track subtle changes in patients’ conditions that are critical to understanding treatment efficacy.

Patient registries and natural history studies are another valuable resource for rare disease trials. These databases collect information on patients with specific rare diseases, offering insights into disease progression and natural variability. By incorporating registry data into clinical trials, researchers can establish more accurate baseline measures and identify trends that may influence trial outcomes.

Collaborating with patient advocacy groups is another strategy that can significantly enhance data collection. These organizations are often deeply connected to their communities and can provide critical insights into patient experiences, challenges, and unmet needs. By involving advocacy groups in trial design and recruitment efforts, researchers can ensure that the trial reflects the real-world experiences of rare disease patients, leading to more meaningful and relevant data collection.

Conclusion

Overcoming the barriers to recruitment and data collection in rare disease clinical trials requires innovation, collaboration, and a patient-centric approach. The use of real-time monitoring technologies is transforming the way researchers identify and engage participants, making clinical trials more accessible to those with rare conditions. By embracing these technologies and working closely with patient advocacy groups, researchers can continue to push the boundaries of what’s possible in rare disease research, bringing new treatments and hope to those who need them most.

 

Park Place Announces New Liquid Cooling Solutions to Cut Costs and Carbon Emissions for Ireland Data Centres

Park Place Technologies, the world’s leading global data centre and networking optimisation firm, today announces the expansion of its portfolio of IT infrastructure services with the introduction of two Liquid Cooling solutions for data centres: Immersion Liquid Cooling and Direct-to-Chip Cooling.

This announcement comes at a critical time for businesses that are seeing a dramatic increase in the compute power they require, driven by the adoption of technologies like AI and IoT. This in turn is driving the need for more on-prem hardware, more space for that hardware, and more energy to run it all – presenting a significant financial and environmental challenge. Park Place Technologies’ new Liquid Cooling solutions are a strong option for businesses looking to address these challenges, as the technology has the potential to deliver substantial financial and environmental results.

Direct-to-Chip is an advanced cooling method that applies coolant directly to the server components that generate the most heat, including CPUs and GPUs. Immersion cooling empowers data centre operators to do more with less: less space and less energy. Using these methods, businesses can improve their Power Usage Effectiveness (PUE) by up to 18 times, and rack density by up to 10 times. Ultimately, this can help deliver power savings of up to 50%, which in turn leads to lower operating costs.
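For context, PUE is the ratio of total facility energy to the energy consumed by IT equipment alone, so 1.0 is the theoretical ideal. A rough, illustrative calculation (the figures below are assumptions for the sake of the example, not Park Place’s numbers):

```sh
# An air-cooled site at PUE 1.6 vs a liquid-cooled site near PUE 1.1:
# the share of total facility energy saved is (1.6 - 1.1) / 1.6.
echo "scale=2; (1.6 - 1.1) / 1.6 * 100" | bc   # ~31% of facility energy saved
```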

From an environmental perspective, liquid cooling is significantly more efficient than traditional air cooling. At present, air cooling technology only captures 30% of the heat generated by the servers, compared to the 100% captured by immersion cooling, resulting in lower carbon emissions for businesses that opt for immersion cooling methods.

Park Place Technologies can deliver a complete, turn-key solution for organisations looking to implement Liquid Cooling technology, removing the complexity of adoption, which is a common barrier for businesses. Park Place Technologies provides a single-vendor solution for the whole process from procuring the hardware, conversion of the servers for liquid cooling, installation, maintenance, monitoring and management of the hardware and the cooling technology.

“Our new Liquid Cooling offerings have the potential to have a significant impact on our customers’ costs and carbon emissions, two of the key issues they face today,” said Chris Carreiro, Chief Technology Officer, at Park Place Technologies.  “Park Place Technologies is ideally positioned to help organisations cut their data centre operations costs, giving them the opportunity to re-invest in driving innovation across their businesses.

“The decision to invest in Immersion Cooling and Direct-to-Chip Cooling depends on various factors, including the specific requirements of the data centre, budget constraints, the desired level of cooling efficiency, and infrastructure complexity. Park Place Technologies can work closely with customers to find the best solution for their business, and can guide them towards the best long-term strategy, while offering short-term results. This takes much of the complexity out of the process, which will enable more businesses to capitalise on this exciting new technology.”

Data loss and ransomware attacks among top cloud cybersecurity risks

A new survey from leading Irish IT managed services provider Auxilion reveals that data loss/theft and ransomware/malware attacks were the cloud cybersecurity concerns most cited by IT leaders, each named by 30% of respondents.

The research, carried out by Censuswide and involving IT decision-makers across large enterprises in the Republic of Ireland, found that 40% of respondents see IT security risks as a main concern associated with adopting and managing cloud computing.

A similar proportion (42%) said that the changing cybersecurity landscape was one of the biggest obstacles to the successful delivery of their IT strategy. Moreover, one in four (26%) IT leaders in Ireland do not think current laws and regulations are sufficient to protect privacy, access, and confidentiality in a cloud-based environment.

Adding to this, almost a quarter (24%) of IT decision-makers surveyed who are currently using the cloud do not think their own organisation has sufficient capabilities to manage cloud computing and more than a third of those respondents admitted to having little or no visibility of their workloads in the cloud (36%).

Despite this, some 83% consider cloud to be a more secure approach for their organisation. It appears that IT leaders are being proactive in this area with 83% also having a cloud security strategy in place and 73% currently using a technology partner to manage their cloud strategy and services.

The study also found that the shift to cloud is set to continue with nearly all respondents (96%) expecting to migrate more workloads, applications, and processes to the cloud over the next 12 months.

On October 9th, Auxilion, HPE and Zerto will be holding an event hosted by broadcaster Ivan Yates to discuss the increased need for robust data protection, cloud security, and business continuity capabilities.

Donal Sullivan, CTO, Auxilion, said: “While the cybersecurity landscape is constantly evolving, organisations are facing an even bigger uphill battle at the moment with the rise of threats enabled by Artificial Intelligence and the introduction of the European-wide NIS2 regulation in October.

“This means businesses not only need to be more proactive when it comes to securing their data and responding to incidents, they also need to ensure that they are meeting their compliance and regulatory obligations. This requires the right technologies and partners that can support security, mobility and scalability.

“The truth is that in this day and age, resilience and recovery are as important as detection and prevention when it comes to cybersecurity. Businesses which fail to recognise this and adapt their strategy could be at risk operationally, reputationally and financially.”

Chris Rogers, Senior Technology Evangelist, Zerto, said: “Rapid recovery from a cyber incident is more than a reactive measure – it’s a critical component of a resilient and forward-thinking business strategy. The ability to swiftly bounce back from disruptions not only minimises downtime but also safeguards reputation, customer trust, and bottom line.

“The real competitive edge lies in turning these challenges into opportunities for growth and innovation, and partnering with experts to unlock advanced cyber resilience capabilities can significantly accelerate an organization’s journey to cyber maturity.”

Dell Technologies Unveils Major Updates to Data Lakehouse for AI Adoption

Dell Technologies has today announced significant performance and connectivity enhancements to its Dell Data Lakehouse platform. These new enhancements are designed to accelerate AI initiatives and streamline data access, providing businesses with fast query speeds, expanded data sources, simplified management and powerful analytics.

The key features of Dell Data Lakehouse v1.1 include enhanced performance, improved connectivity, simplified management and expanded accessibility.

Turbocharged performance

New Warp Speed technology and high-performance SSDs boost query performance by 3x to 5x through automated learning of query patterns and optimisation of indexes and caches, allowing businesses to extract insights from data faster than ever before.

Improved connectivity

Dell Technologies has enhanced connectivity options by securely connecting to an existing Hive Metastore via Kerberos for seamless metadata operations and improved data governance. The new Neo4j graph database connector is now in public preview, and the Snowflake connector has been optimised for efficient querying. Additionally, upgraded connectors for Iceberg, Delta Lake, Hive, and other popular data sources ensure faster and more capable operations.

Simplified management

Dell has streamlined operations with new features to ensure system robustness and security. Dell support teams can now easily assess cluster health before or after installations or upgrades, helping to ensure zero downtime. The system also sends critical hardware failure alerts directly to Dell Support for proactive handling. Additionally, optional end-to-end encryption for internal components is available to secure the Lakehouse.

Expanded accessibility

Dell has introduced a new 5-year software subscription option, complementing the existing 1- and 3-year subscriptions, to align hardware and software support terms. To meet growing demand, the Dell Data Lakehouse is now available in more countries across Europe, Africa, and Asia. Additionally, customers can now access the Dell Data Lakehouse in the Dell Demo Center, and soon in the Customer Solution Center, for interactive exploration and validation.

Speaking about the new updates in Dell’s Modern Data Lakehouse, Vrashank Jain, Product Manager – Data Management at Dell Technologies said, “Dell Data Lakehouse with Warp Speed sets a new benchmark in data lake analytics, empowering organisations to derive insights from their data more quickly and efficiently than ever before. Warp Speed unlocks the full potential of the Dell Data Lakehouse, paving the way for accelerated and budget-friendly innovation and growth in the AI era.”

To get a full, hands-on experience, visit the Dell Demo Center to interactively explore the Dell Data Lakehouse with labs developed by Dell Technologies’ experts. Businesses and organisations can also contact their Dell account executive to explore the Dell Data Lakehouse for their data needs.