Artificial Intelligence

  • What is it?
    Artificial Intelligence (AI) is a broad branch of computer science. The goal of AI is to create machines that can function intelligently and independently, and that can work and react in the same way humans do. To build these abilities, machines – and the software & applications that enable them – need to derive their intelligence the way humans do: by retaining information and becoming smarter over time.

    AI is not a new concept – the idea has been discussed since the 1950s – but it has only recently become technically feasible to develop and deploy in the real world, thanks to advances in technology. These include our ability to collect and store the huge amounts of data required for machine learning, as well as rapid increases in processing speeds and computing power, which make it possible to process the collected data to train a machine or application and make it "smarter".

  • Why do you need it?
    Although we tend to associate AI with the image of a self-aware robot that can move, act and think just like a human being (courtesy of countless science fiction films), you may already be using AI more than you know. For example, YouTube and Netflix rely on AI to make video recommendations, classify content and censor inappropriate material, while speech recognition and language translation platforms such as Amazon Alexa and Google Translate use AI to better understand real-world speech and perform translation. As users interact with these applications, the applications become smarter by remembering user behavior and reactions. AI will also be a key enabler of many technologies that are on the verge of mainstream deployment, such as autonomous driving or flying drones used for package delivery. What makes AI important for these applications is the ability to make decisions independently in real time based on real-world data, and to learn from this data and from user & environment feedback to become more accurate over time.

  • How is GIGABYTE helpful?
    Currently, one of the most widely adopted methods of developing artificial intelligence in machines and applications is machine learning, along with its advanced variant Deep Learning, which uses Deep Neural Network (DNN) models – complex algorithms similar in structure and function to the human brain. Deep Learning requires not only a large amount of data (which can be stored and processed with GIGABYTE's Storage Servers and / or High Density Servers), but also massive parallel computing power to train an algorithm on this data. GIGABYTE's GPU Servers (such as the G481-S80 or G291-280) are ideal for this task; a minimal sketch of a typical DNN training loop is shown below.
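
    To illustrate the kind of workload a GPU server accelerates, here is a minimal sketch of a DNN training loop written in PyTorch (the framework, the layer sizes and the randomly generated placeholder data are all illustrative assumptions, not part of any specific GIGABYTE product; a real job would load a large labeled dataset instead):

    ```python
    # Minimal sketch of a deep neural network (DNN) training loop in PyTorch.
    # The dataset here is random placeholder data; real training jobs iterate
    # over far larger datasets, which is why storage capacity and GPU compute
    # power both matter.
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader, TensorDataset

    # Run on a GPU if one is available; training on CPUs alone is far slower.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # A small fully connected network: input -> hidden layers -> 10-class output.
    model = nn.Sequential(
        nn.Linear(128, 256), nn.ReLU(),
        nn.Linear(256, 256), nn.ReLU(),
        nn.Linear(256, 10),
    ).to(device)

    # Placeholder dataset: 10,000 random feature vectors with random labels.
    features = torch.randn(10_000, 128)
    labels = torch.randint(0, 10, (10_000,))
    loader = DataLoader(TensorDataset(features, labels), batch_size=256, shuffle=True)

    loss_fn = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    # Each epoch: forward pass, compute loss, backward pass, update weights.
    for epoch in range(5):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()
        print(f"epoch {epoch}: loss {loss.item():.4f}")
    ```

    In practice, such a loop runs for many epochs over datasets orders of magnitude larger, and is typically spread across multiple GPUs – which is where the parallel computing power of a dedicated GPU server comes in.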

    GIGABYTE has also developed a DNN Training Appliance, a fully integrated software and hardware stack built on our G481-HA1 server for hassle-free setup, management and monitoring of a machine learning environment. It includes hardware and software optimizations that reduce the time required for DNN training jobs and improve their accuracy.

  • RELATED ARTICLES
    What is a Server? A Tech Guide by GIGABYTE
    In the modern age, we enjoy an incredible amount of computing power—not because of any device that we own, but because of the servers we are connected to. They handle all our myriad requests, whether it is to send an email, play a game, or find a restaurant. They are the inventions that make our intrinsically connected age of digital information possible. But what, exactly, is a server? GIGABYTE Technology, an industry leader in high-performance servers, presents our latest Tech Guide. We delve into what a server is, how it works, and what exciting new breakthroughs GIGABYTE has made in the field of server solutions.
    Spain’s IFISC Tackles COVID-19, Climate Change with GIGABYTE Servers
    By using GIGABYTE, Spain’s Institute for Cross-Disciplinary Physics and Complex Systems is pitting the world’s foremost server solutions against some of the world’s most pressing issues, including the effects of climate change, the effects of pollution, and the COVID-19 pandemic. GIGABYTE servers are up to the diverse and daunting tasks, because they are designed for high performance computing, intensive numerical simulations, AI development, and big data management.
    Lowell Observatory Looks for Habitable Exoplanets with GIGABYTE Servers
    Arizona’s Lowell Observatory is studying the Sun with GIGABYTE’s G482-Z50 GPU Server in an effort to filter out “stellar noise” when looking for habitable planets outside of our Solar System. The server’s AMD EPYC™ processors, parallel computing capabilities, excellent scalability, and industry-leading stability are all features that qualify it for this astronomical task, making the discovery of a true “Twin Earth” achievable within our lifetime.
    How to Build Your Data Center with GIGABYTE? A Free Downloadable Tech Guide
    GIGABYTE is pleased to publish our first long-form “Tech Guide”: an in-depth, multipart document shedding light on important tech trends or applications, and presenting possible solutions to help you benefit from these innovations. In this Tech Guide, we delve into the making of “Data Centers”—what they are, who they are for, what to keep in mind when building them, and how you may build your own with products and consultation from GIGABYTE.