Milestone Systems, a Denmark-based video technology company, has opened its first Experience Center in India, the third such facility the company has launched in the Asia Pacific (APAC) region. The company also introduced Milestone Kite, video surveillance as a service (VSaaS) software that offers features such as visitor tracking and artificial intelligence (AI) analytics. The offering targets businesses with multiple locations and a small number of cameras installed at each site.
Milestone Systems Opens Its First Experience Center in India
In a press release (via Analytics India Magazine), the Denmark-based video technology company announced the launch of its first Experience Center, located in Bengaluru. The Experience Center is affiliated with Milestone Systems' three offices in Mumbai, Delhi and Bengaluru. The move is said to be aimed at expanding the company's presence in the Indian market.
The new Experience Center, the company's third such facility in the Asia Pacific region, offers an interactive space where visitors can observe and explore new-era innovations in video technology. The company applies these technologies specifically in manufacturing automation, data-center security and healthcare safety. The launch is in line with the country's Viksit Bharat 2047 initiative, the company said in the press release.
With its new push into India, Milestone Systems aims to support the country's economic development in sectors such as automotive manufacturing, smart cities, critical infrastructure and data centers. It said its video solutions could help create safer spaces in sensitive locations such as schools and hospitals, and added that the technology can also improve infrastructure efficiency.
Milestone Kite, a VSaaS Offering, Is Introduced
Alongside the launch of the Experience Center, the company also announced Milestone Kite, VSaaS software designed for businesses. Compatible with more than 25,000 devices worldwide, Milestone Kite is cloud-based software that can be integrated into the security systems of companies of different sizes and structures. The company emphasizes, however, that it is best suited to businesses with multiple locations and a few security cameras at each.
Milestone Kite can operate even with limited IT support and bandwidth. The company says it can also provide coverage in locations where gateway devices cannot be installed on site. In terms of features, it offers real-time incident reporting, visitor tracking and AI-based analytics.
A cubic millimetre is a tiny volume — less than a teardrop. But a cubic millimetre of mouse brain is densely packed with tens of thousands of neurons and other cells in a staggeringly complex architectural weave.
Reconstructing such elaborate arrangements requires monumental effort, but the researchers affiliated with the Machine Intelligence from Cortical Networks (MICrONS) programme pulled it off. It took US$100 million and years of effort by more than 100 scientists, coordinated by 3 groups that had never collaborated before. There were weeks of all-nighters and a painstaking global proofreading effort that continues even now — for a volume that represents just 0.2% of the typical mouse brain. Despite the hurdles, the core of the project — conceived and funded by the US Intelligence Advanced Research Projects Activity (IARPA) — is complete.
The resulting package includes a high-resolution 3D electron microscopy reconstruction of the cells and organelles in two separate volumes of the mouse visual cortex, coupled with fluorescent imaging of neuronal activity from the same volumes. Even the coordinators of the MICrONS project, who describe IARPA’s assembly of the consortium as a ‘shotgun wedding’ of parallel research efforts, were pleasantly surprised by the outcome. “It formed this contiguous team, and we’ve been working extremely well together,” says Andreas Tolias, a systems neuroscientist who led the functional imaging effort at Baylor College of Medicine in Houston, Texas. “It’s impressive.”
The MICrONS project is a milestone in the field of ‘connectomics’, which aims to unravel the synaptic-scale organization of the brain and chart the circuits that coordinate the organ’s many functions. The data from these first two volumes are already providing the neuroscience community with a valuable resource. But this work is also bringing scientists into strange and challenging new territory. “The main casualty of this information is understanding,” says Jeff Lichtman, a connectomics pioneer at Harvard University in Cambridge, Massachusetts. “The more we know, the harder it is to turn this into a simple, easy-to-understand model of how the brain works.”
Short circuits
There are many ways to look at the brain, but for connectivity researchers, electron microscopy has proved especially powerful.
In 1986, scientists at the University of Cambridge, UK, used serial-section electron microscopy to generate a complete map of the nervous system for the roundworm Caenorhabditis elegans1. That connectome was a landmark achievement in the history of biology. It required the arduous manual annotation and reconstruction of some 8,000 2D images, but yielded a Rosetta Stone for understanding the nervous system of this simple, but important, animal model.
No comparable resource exists for more complex animals, but early forays into the rodent connectome have given hints of what such a map could reveal. Lichtman recalls the assembly he and his colleagues produced in 2015 from a 1,500-cubic-micron section of mouse neocortex — roughly one-millionth of the volume used in the MICrONS project2. “Most people were just shocked to see the density of wires all pushed together in any little part of brain,” he says.
Similarly, Moritz Helmstaedter, a connectomics researcher at the Max Planck Institute for Brain Research in Frankfurt, Germany, says that his team’s efforts3 in reconstructing a densely packed region of the mouse somatosensory cortex, which processes sensations related to touch, in 2019 challenged existing dogma — especially the assumption that neurons in the cortex are randomly wired. “We explicitly proved that wrong,” Helmstaedter says. “We found this extreme precision.” These and other studies have collectively helped to cement the importance of electron-microscopy-based circuit maps as a complement to techniques such as light microscopy and molecular methods.
Bigger and better
IARPA’s motivation for the MICrONS project was grounded in artificial intelligence. The goal was to generate a detailed connectomic map at the cubic-millimetre scale, which could then be ‘reverse-engineered’ to identify architectural principles that might guide the development of biologically informed artificial neural networks.
Tolias, neuroscientist Sebastian Seung at Princeton University in New Jersey, and neurobiologist Clay Reid at the Allen Institute for Brain Science in Seattle, Washington, had all applied independently for funding to contribute to separate elements of this programme. But IARPA’s programme officers elected to combine the 3 teams into a single consortium — including a broader network of collaborators — issuing $100 million in 2016 to support a 5-year effort.
A Martinotti cell, a small neuron with branching dendrites, with synaptic outputs highlighted. Credit: MICrONS Explorer
The MICrONS team selected two areas from the mouse visual cortex: the aforementioned cubic millimetre, and a much smaller volume that served as a pilot for the workflow. These were chosen so the team could investigate the interactions between disparate regions in the visual pathway, explains Tolias, who oversaw the brain-activity-imaging aspect of the work at Baylor. To achieve that, the researchers genetically engineered a mouse to express a calcium-sensitive ‘reporter gene’, which produces a fluorescent signal whenever a neuron or population of neurons fires. His team then assembled video footage of diverse realistic scenes, which the animal watched with each eye independently for two hours while a microscope tracked neuronal activity.
The mouse was then shipped to Seattle for preparation and imaging of the relevant brain volumes — and the pressure kicked up another notch. Nuno da Costa, a neuroanatomist and associate investigator at the Allen Institute, says he and Tolias compressed their groups’ schedules to accommodate the final, time-consuming stage of digital reconstruction and analysis conducted by Seung’s group. “We really pushed ourselves to deliver — to fail as early as possible so we can course-correct in time,” da Costa says. This meant a race against the clock to excise the tissue, carve it into ultra-thin slices and then image the stained slices with a fleet of 5 electron microscopes. “We invested in this approach where we could buy very old machines, and really automate them to make them super-fast,” says da Costa. The researchers could thus maximize throughput and had backups should a microscope fail.
For phase one of the project, which involved reconstructing the smaller cortical volume, sectioning of the tissue came down to the heroic efforts of Agnes Bodor, a neuroscientist at the Allen Institute, who spent more than a month hand-collecting several thousand 40-nanometre-thick sections of tissue using a diamond-bladed instrument known as a microtome, da Costa says. That manual effort was untenable for the larger volume in phase two of the project, so the Allen team adopted an automated approach. Over 12 days of round-the-clock, supervised work, the team generated almost 28,000 sections containing more than 200,000 cells4. It took six months to image all those sections, yielding some 2 petabytes of data.
The Allen and Baylor teams also collaborated to link the fluorescently imaged cells with their counterparts in the reconstructed connectomic volume.
A network of thousands of individual neurons from a small subset of cells in the Machine Intelligence from Cortical Networks project data set. Credit: MICrONS Explorer
Throughout this process, the Allen team relayed its data sets to the team at Princeton University. Serial-section electron microscopy is a well-established technique, but assembly of the reconstructed volume entails considerable computational work. Images must be precisely aligned with one another while accounting for any preparation- or imaging-associated deformations, and then they are subjected to ‘segmentation’ to identify and annotate neurons, non-neuronal cells such as glia, organelles and other structures. “The revolutionary technology in MICrONS was image alignment,” Seung says. This part is crucial, because a misstep in the positioning of a single slice can derail the remainder of the reconstruction process. Manual curation would be entirely impractical at the cubic-millimetre scale. But through its work in phase one, the team developed a reconstruction workflow that could be scaled up for the larger brain volume, and continuing advances in deep-learning methods made it possible to automate key alignment steps.
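The deep-learning alignment pipeline MICrONS developed is far more sophisticated, but the basic registration problem — estimating how one slice is displaced relative to its neighbour — can be illustrated with classic FFT phase correlation. This is a toy sketch under simplifying assumptions (integer translation only, no deformation), not the project's actual method:

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the integer (row, col) translation of `img` relative to
    `ref` via FFT phase correlation. A toy stand-in for the learned,
    deformation-aware alignment used at MICrONS scale."""
    # Normalized cross-power spectrum isolates the phase difference
    f = np.fft.fft2(img) * np.conj(np.fft.fft2(ref))
    corr = np.fft.ifft2(f / (np.abs(f) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Convert wrap-around peak positions to signed shifts
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))
```

Applied slice by slice, an accumulated error from a single badly registered pair propagates through every subsequent section — which is why, as Seung notes, alignment was the make-or-break step.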
To check the work, Sven Dorkenwald, who was a graduate student in Seung’s laboratory and is now a research fellow at the Allen Institute, developed a proofreading framework to refine the team’s reconstructions and ensure their biological fidelity. This approach, which verified the paths of neuronal processes through the connectome, carved the volumes into ‘supervoxels’ — 3D shapes that define segmented cellular or subcellular features, which can be rearranged to improve connectomic accuracy — and Dorkenwald says the final MICrONS data set had 112 billion of them. The system is analogous to the online encyclopedia Wikipedia in some ways, allowing many users to contribute edits in parallel while also logging the history of changes. But even crowdsourced proofreading is slow going — Dorkenwald estimates that each axon (the neuronal projections that transmit signals to other cells) in the MICrONS data set takes up to 50 hours to proofread.
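The mechanics of an edit-logged proofreading system can be sketched in miniature: supervoxels are grouped into tentative segments, and every merge or split is appended to an auditable history, Wikipedia-style. This is an illustrative toy, with invented names, and not the actual MICrONS proofreading framework:

```python
from collections import defaultdict

class ProofreadingLog:
    """Toy edit-logged proofreading store: supervoxels map to segment
    labels, and every edit is recorded so changes can be audited or
    replayed. Illustrative only."""

    def __init__(self, supervoxels):
        # Each supervoxel starts as its own one-element segment
        self.segment_of = {sv: sv for sv in supervoxels}
        self.history = []

    def merge(self, a, b, editor):
        """Merge the segment containing b into the segment containing a."""
        target, source = self.segment_of[a], self.segment_of[b]
        for sv, seg in self.segment_of.items():
            if seg == source:
                self.segment_of[sv] = target
        self.history.append(("merge", a, b, editor))

    def split(self, sv, editor):
        """Detach one supervoxel into its own segment (a crude 'split')."""
        self.segment_of[sv] = sv
        self.history.append(("split", sv, None, editor))

    def segments(self):
        groups = defaultdict(set)
        for sv, seg in self.segment_of.items():
            groups[seg].add(sv)
        return list(groups.values())
```

A real system must do this concurrently for billions of supervoxels, which is what makes the parallel, logged design essential.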
Charting new territory
The MICrONS team published a summary5 of its phase one results in 2022. Many of its other early findings still await publication, including a detailed description of the work from phase two — although this is currently available as a preprint article4. But there are already some important demonstrations of what connectomics at this scale can deliver.
One MICrONS preprint, for example, describes what is perhaps the most comprehensive circuit map so far for a cortical column6, a layered arrangement of neurons that is thought to be the fundamental organizational unit of the cerebral cortex. The team’s reconstruction yielded a detailed census of all the different cell types residing in the column and revealed previously unknown patterns in how various subtypes of neuron connect with one another. “Inhibitory cells have this remarkable specificity towards some excitatory cell types, even when these excitatory cells are mixed together in the same layer,” says da Costa. Such insights could lead to more precise classification of the cells that boost or suppress circuit activity and reveal the underlying rules that guide the wiring of those circuits.
Crucially, says Tolias, the MICrONS project was about more than the connectome: “It was large-scale, functional imaging of the same mouse.” Much of his team’s work has focused on translating calcium reporter-based activity measurements into next-generation computational models. In 2023, the researchers posted a preprint that describes the creation of a deep-learning-based ‘digital twin’ on the basis of experimentally measured cortical responses to visual stimuli7. The predictions generated by this ‘twin’ can then be tested, further refining the model and enhancing its accuracy.
One surprising and valuable product of the MICrONS effort involves fruit flies. Early in the project, Seung’s team began exploring serial-section electron-microscopy data from the Drosophila melanogaster brain produced by researchers at the Howard Hughes Medical Institute’s Janelia Research Campus in Ashburn, Virginia8. “I realized that because we had developed this image-alignment technology, we had a chance to do something that people thought was impossible,” says Seung. His team — including Dorkenwald — used the Janelia data as a proving ground for the algorithms that had been developed for MICrONS. The result was the first complete assembly of the fruit-fly brain connectome — around 130,000 neurons in total9.
Given that the wiring of the nervous system is generally conserved across fruit flies, Dorkenwald is enthusiastic about how these data — which are publicly accessible at http://flywire.ai — could enable future experiments. “You can do functional imaging on a fly, and because you can find the same neurons over in the connectome, you will be able to do these functional-structure analyses,” he says.
The mouse connectome will not be so simple, because connectivity varies from individual to individual. But the MICrONS data are nevertheless valuable for the neuroscience community, says Helmstaedter, who was not part of the MICrONS project. “It’s great data, and it’s inspiring people just to go look at it and see it,” he says. There’s also the power of demonstrating what is possible, and how it could be done better. “You’ve got to do something brute force first to find out where you can make it easier the next round,” says Kristen Harris, a neuroscientist at the University of Texas at Austin. “And the act of doing it — just getting the job done — is just spectacular.”
Terra incognita
Even as analysis of the MICrONS data set proceeds, its limitations are already becoming clear. For one thing, volumes from other distinct cortical regions will be needed to identify features that are broadly observed throughout the brain versus those features that are distinct to the visual cortex. And many axons from this first cubic millimetre will inevitably connect to points unknown, Lichtman notes, limiting researchers’ ability to fully understand the structure and function of the circuits within it.
Scaling up will be even harder. Lichtman estimates that a whole-brain electron-microscopy reconstruction would produce roughly an exabyte of data, which is equivalent to a billion gigabytes and is 1,000 times greater than the petabytes of data produced by the MICrONS project. “This may be a ‘Mars shot’ — it’s really much harder than going to the Moon,” he says.
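The article's figures make this estimate easy to check: a cubic millimetre yielding roughly 2 petabytes, at about 0.2% of a mouse brain, scales to about an exabyte for the whole organ. A quick back-of-the-envelope calculation using the numbers quoted above:

```python
# Back-of-the-envelope check of the whole-brain data estimate,
# using figures quoted in this article (rounded, not official numbers).

MICRONS_VOLUME_MM3 = 1.0     # the cubic millimetre imaged by MICrONS
MICRONS_DATA_PB = 2.0        # ~2 petabytes of electron-microscopy data
FRACTION_OF_BRAIN = 0.002    # that volume is ~0.2% of a mouse brain

whole_brain_mm3 = MICRONS_VOLUME_MM3 / FRACTION_OF_BRAIN      # ~500 mm^3
whole_brain_pb = MICRONS_DATA_PB * (whole_brain_mm3 / MICRONS_VOLUME_MM3)
whole_brain_eb = whole_brain_pb / 1000.0                      # 1,000 PB = 1 EB

print(f"whole mouse brain ~ {whole_brain_mm3:.0f} mm^3")
print(f"estimated data volume ~ {whole_brain_pb:.0f} PB ~ {whole_brain_eb:.1f} EB")
```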
Still, the race is under way. One major effort is BRAIN CONNECTS, a project backed by the US National Institutes of Health with $150 million in funding, which is coordinated by multiple researchers, including Seung, da Costa and Lichtman. “We’re not delivering the whole mouse brain yet, but testing if it’s possible,” da Costa says. “Mitigating all the risks, bringing the cost down, and seeing if we can actually prepare a whole-mouse-brain or whole-hemisphere sample.”
In parallel, Lichtman is working with a team at Google Research in Mountain View, California, led by computer scientist Viren Jain — who collaborated with MICrONS and is also part of the BRAIN CONNECTS leadership team — to map sizable volumes of the human cortex using electron microscopy. They’ve already released data from their first cubic millimetre and have plans to begin charting other regions from people with various neurological conditions10.
These efforts will require improved tools. The serial-section electron-microscopy strategy that MICrONS used is too labour-intensive to use at larger scales and yields relatively low-quality data that are hard to analyse. But alternatives are emerging. For example, ‘block-face’ electron-microscopy methods, in which the sample is imaged as a solid volume and then gradually shaved away with a high-intensity ion-beam, require less work in terms of image alignment and can be applied to thick sections of tissue that are easier to manage. These methods can be combined with cutting-edge multi-beam scanning electron microscopes, which image specimens using up to 91 electron beams simultaneously, thus accelerating data collection. “That’s one of the leading contenders for scale up to a whole mouse brain,” says Seung, who will be working with Lichtman on this strategy.
Further automation and more artificial-intelligence tools will also be assets. Helmstaedter and his colleagues have been looking into ways to simplify image assembly with an automated segmentation algorithm called RoboEM, which traces neural processes with minimal human intervention and can potentially eliminate a lot of the current proofreading burden11. Still, higher-quality sample preparation and imaging are probably the true key to efficiency at scale, Helmstaedter says. “The better your data, the less you have to worry about automation.”
However they are generated, making sense of these connectome maps will take more than fancy technology. Tolias thinks “it will be almost impossible” to replicate the coupling of structure and activity produced by MICrONS at the whole-brain scale. But it’s also unclear whether that will be necessary and to what extent functional information can be inferred through a better understanding of brain structure and organization.
For Lichtman, the connectome’s value will ultimately transcend conventional hypothesis-driven science. A connectome “forces you to see things you weren’t looking for, and yet they’re staring you in the face”, he says. “I think if we do a whole mouse brain, there will be just an infinite number of ‘wow, really?’ discoveries.”
April 9, 2007: Apple sells its 100 millionth iPod. Coming five-and-a-half years after the portable music player went on sale, the landmark event confirms the iPod as Apple’s most popular product of all time.
Until the iPhone arrives a couple of months later, that is!
iPod becomes Apple’s biggest product
Launched in 2001, the original iPod famously put “1,000 songs in your pocket” on a 5GB hard drive. The device capitalized on the rise of digital audio files to replace physical recordings like CDs and vinyl. It also played into Apple co-founder Steve Jobs’ digital hub strategy, which placed the Mac at the center of consumers’ lives, and laid the groundwork for the iPhone.
The devices spawned an enormous ecosystem of more than 4,000 accessories, from cases to standalone speakers. Plus, upward of 70% of cars produced in the United States in 2007 offered iPod connectivity.
iPod and iTunes drive Apple’s success
Along with the success of the iTunes Music Store — the third-biggest music store in the United States at the time — the iPod represented Apple’s ascendancy to the pinnacle of the tech world.
It was an amazing turnaround for a company that almost went out of business just a decade earlier.
“At this historic milestone, we want to thank music lovers everywhere for making iPod such an incredible success,” said Jobs in a statement issued by Apple. “iPod has helped millions of people around the world rekindle their passion for music, and we’re thrilled to be a part of that.”
Celebrities pay homage to the iPod
To mark the 100 million iPod sales milestone, Apple enlisted celebs from the music and sports worlds.
“It’s hard to remember what I did before the iPod,” said singer Mary J. Blige in the same press release. The iPod is “more than just a music player,” she said, “it’s an extension of your personality and a great way to take your favorite music with you everywhere you go.”
Lance Armstrong, seven-time Tour de France champion, also sang the iPod’s praises.
“I take my running shoes and my iPod with me everywhere,” said the bicyclist. “I listen to music when I run. Having my music with me is really motivating.”
Apple also took out an advertisement in The Wall Street Journal celebrating the sale of the 100 millionth iPod.
Apple marks the occasion with a full-page newspaper ad. Photo: Apple
100 million iPods was just the start, but all good things must come to an end
But this was just the start. By 2011, Apple had sold a massive 300 million iPods. While the company eventually stopped reporting iPod sales, the final tally probably stands at more than 400 million. The iPhone, meanwhile, passed the 1 billionth unit sold in summer 2016 and just kept going.
“Since its introduction over 20 years ago, iPod has captivated users all over the world who love the ability to take their music with them on the go,” Apple said in a press release — entitled “The music lives on” — on May 10, 2022. “Today, the experience of taking one’s music library out into the world has been integrated across Apple’s product line — from iPhone and Apple Watch to iPad and Mac — along with access to more than 90 million songs and over 30,000 playlists available via Apple Music.”
And with that, Apple pulled the plug on the mighty iPod and marked the end of an era.
How many iPods did you own back in the day? Let us know in the comments below.
In 2023, the Securities and Exchange Commission (SEC) implemented new cybersecurity disclosure rules. These regulations mandate the disclosure of “material” threat and breach incidents within four business days of determining that an incident is material, along with annual reporting on cybersecurity risk management, strategy, and governance.
The introduction of the new SEC cybersecurity requirements represents a critical milestone in the continuous fight against cyber threats. In 2023, chief information security officers (CISOs) revealed that three out of four companies in the United States were vulnerable to a material cyberattack. Consequently, cybercrime remains one of the foremost risks confronting US-based companies. Additionally, in the same year, nearly seven out of ten organizations in the United States experienced a ransomware attack within the preceding twelve months.
Cyberattacks pose significant risks to businesses, primarily in terms of financial damage. In 2024, cybercrime is projected to cost the United States alone more than $452 billion. Additionally, the loss of sensitive data is a consequential outcome of cyberattacks. In 2023, the United States ranked third globally in the percentage of companies reporting the loss of sensitive information.
Furthermore, data compromise incidents affected approximately 422 million individuals in the country in 2022, totaling 1,802 incidents. The US is recognized among the countries with high data breach density. Beyond financial and data loss implications, businesses are also wary of reputational damage, significant downtimes, and the potential loss of current customers, all of which can affect a company’s valuation and overall standing.
William Belov
Rise of awareness
With growing risks and the new SEC rules in mind, companies are strengthening their defenses, according to a recent report by Infatica, a provider in the proxy-service market. The company's data show that searches for proxy services have jumped by 106.5% over the last year. The reason behind this trend is that proxies can be used to imitate the conditions of a cyberattack from outside the network, letting companies test their own defenses.
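In practice, testing defenses this way means probing your own perimeter from many external vantage points by rotating through a pool of proxy endpoints. A minimal sketch of such a rotation helper is below; the endpoint URLs are invented, and the returned mapping matches the `proxies` parameter accepted by HTTP libraries such as `requests`:

```python
from itertools import cycle

def make_proxy_rotator(proxy_urls):
    """Round-robin over a pool of proxy endpoints, returning a
    proxies mapping on each call. Endpoint URLs here are
    hypothetical placeholders, not real services."""
    pool = cycle(proxy_urls)

    def next_proxies():
        url = next(pool)
        # Same endpoint for both schemes, as `requests` expects
        return {"http": url, "https": url}

    return next_proxies
```

A security team might then issue requests against its own public services, e.g. `requests.get(target_url, proxies=next_proxies(), timeout=10)`, to see how the perimeter responds to traffic arriving from unfamiliar networks.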
The growing interest in proxy servers is not limited to seeking enhanced security measures alone. Searches for “free web proxy server” have risen by 5,042.9%, indicating a widespread pursuit of accessible solutions that offer anonymity. Meanwhile, the demand for “proxy server list” and “anonymous proxy server” has also seen significant upticks of 80.6% and 414.3%, respectively, highlighting the importance of reliable and discreet online operations.
While the SEC’s cybersecurity rules primarily target publicly listed companies, many of these firms depend on smaller third-party software and supply chain providers. A cyberattack at any juncture within this chain could result in significant consequences. This is why non-public entities are compelled to bolster their defenses too.
Major gap
As businesses ramp up their activities, significant gaps remain evident. A staggering 81% of security leaders acknowledge the impact of the new rules on their businesses. However, only 54% convey confidence in their organization’s ability to comply effectively. Surprisingly, merely 2% of security leaders have initiated the process of adhering to the new rules. Approximately 33% are still in the early stages, while a striking 68% feel overwhelmed by the new disclosure requirements.
Among the myriad challenges, determining the materiality of cybersecurity incidents stands out, with 49% of respondents highlighting its complexity. Additionally, 47% struggle with enhancing their disclosure processes, further complicating compliance efforts.
Here is some advice on how to prepare for compliance with the SEC's cybersecurity rules:
1. Consolidate your cybersecurity risk data
With the new regulations mandating the disclosure of incidents upon discovery and comprehensive reports on cybersecurity strategy quarterly and annually, organizations must prioritize centralizing cybersecurity risk assessment and incident data. Consolidating this data into a single repository, rather than scattered across spreadsheet software or lost in email inboxes, increases the likelihood of meeting SEC deadlines and reduces the time spent gathering information from different departments and stakeholders for incident disclosure.
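A centralized repository can be as simple as a structured incident registry that every team writes to. The sketch below is illustrative only — the field names are assumptions, not an SEC-mandated schema, and the deadline helper naively counts calendar days rather than the business days the rule actually specifies:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class Incident:
    """One record in a central incident repository (illustrative fields)."""
    incident_id: str
    determined_material: date   # date materiality was determined
    description: str

@dataclass
class IncidentRegistry:
    incidents: list = field(default_factory=list)

    def add(self, incident):
        self.incidents.append(incident)

    def disclosure_due(self, incident_id, days=4):
        """Naive deadline: four calendar days after the materiality
        determination (the real rule counts business days)."""
        inc = next(i for i in self.incidents if i.incident_id == incident_id)
        return inc.determined_material + timedelta(days=days)
```

Keeping records in one queryable store like this, rather than scattered across spreadsheets and inboxes, is what makes the deadline computable at all.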
2. Acquire cyber risk quantification capabilities
Traditionally, organizations have used qualitative methods such as ordinal lists or red-yellow-and-green severity charts to assess the significance of cybersecurity incidents or other risk events. While the SEC recommends considering these assessments for incident materiality determination, quantifying cyber risk offers a more accurate insight into the financial impact of an incident. Understanding the quantified financial impact of cyber risks enables organizations to take necessary steps to mitigate costly risks or, ideally, prevent them altogether. This approach reduces the overall volume of disclosures required.
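One widely taught quantitative approach (not specific to the SEC rules) is annualized loss expectancy: the single loss expectancy is the asset's value times the fraction damaged per incident, and multiplying by the expected incidents per year gives an annual dollar figure. The numbers below are invented for illustration:

```python
def annualized_loss_expectancy(asset_value, exposure_factor, annual_rate):
    """Classic quantitative risk formula: ALE = SLE * ARO, where
    SLE = asset_value * exposure_factor is the single loss expectancy
    and annual_rate (ARO) is the expected incidents per year."""
    sle = asset_value * exposure_factor
    return sle * annual_rate

# e.g. a $2,000,000 system, 25% damaged per incident, 0.5 incidents/year
ale = annualized_loss_expectancy(2_000_000, 0.25, 0.5)
print(f"ALE = ${ale:,.0f}")
```

A dollar figure like this, however rough, gives leadership a common unit for comparing cyber risks with other business risks when weighing materiality.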
3. Optimize your incident management processes
It’s an opportune moment to conduct a comprehensive review of your organization’s incident management processes to ensure they are proficient in identifying, addressing, and reporting cybersecurity incidents. Streamlining and refining these processes facilitate the interception of cyber risks before they escalate into significant issues and enable swift reporting when necessary.
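One way to make such a process enforceable rather than aspirational is to encode it as explicit stage transitions. The stages and transitions below are assumptions for illustration, not a prescribed SEC workflow:

```python
# Minimal incident-workflow state machine (illustrative stages only).
ALLOWED = {
    "identified": {"triaged"},
    "triaged": {"contained", "dismissed"},
    "contained": {"reported"},
    "reported": {"closed"},
}

def advance(state, new_state):
    """Move an incident to `new_state`, rejecting skipped stages so
    nothing is reported before it has been triaged and contained."""
    if new_state not in ALLOWED.get(state, set()):
        raise ValueError(f"cannot go from {state!r} to {new_state!r}")
    return new_state
```

Encoding the workflow this way means a skipped step fails loudly during an incident, instead of surfacing later as a gap in the disclosure record.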
4. Enhance your cybersecurity and cyber risk governance
Ensuring compliance with the SEC’s new regulations involves adequately informing your board of directors about your organization’s cybersecurity risk management practices. Implementing robust reporting and communication processes is essential to regularly update leadership on cyber risk management efforts and any incidents experienced by the company. Furthermore, it’s crucial to articulate how these incidents may impact or are already affecting the organization’s strategy and finances.
5. Secure your third-party relationships
The updated regulations emphasize the importance of assessing cyber risk beyond the confines of your organization. Meeting the requirements for reporting on third-party cyber risk assessment and secure vendor selection underscores the necessity of establishing an effective third-party risk management program. Indeed, supply chain attacks aimed at smaller contractors and vendors frequently rank among the primary causes of cybersecurity incidents at larger organizations.
6. Build a cyber risk culture within your teams
Digital transformation has significantly impacted nearly every organization, particularly in the years following the COVID-19 pandemic, which accelerated the shift of work and life online. Consequently, there has been a surge in employees connecting to organizational networks from various locations and devices, significantly expanding our cybersecurity attack surfaces. This shift underscores the critical importance of fostering a culture of cybersecurity risk awareness where cybersecurity is seen as everyone’s responsibility, not just the purview of the information security team. The more awareness of the threat posed by cyber risks that an organization can instill in its members, the stronger its overall cybersecurity posture will be, reducing the time needed to disclose incidents to the SEC.
While the SEC regulations pose challenges, they also present opportunities. Following the rules can reduce companies' cybersecurity risk, enhance investor confidence, attract capital investment, and contribute to long-term business sustainability.
This article was produced as part of TechRadar Pro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadar Pro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro