Inference Engine

  • What is it?
    In the field of Artificial Intelligence, an inference engine is the component of a system, such as an expert system, that applies logical rules to a knowledge base in order to deduce new information. These rules are typically represented as IF-THEN rules: when the facts in a rule’s IF clause are all present in the knowledge base, the engine adds the fact in its THEN clause as new knowledge.
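
    To make this concrete, below is a minimal forward-chaining sketch in Python (an illustrative example only; the function and the sample facts are hypothetical, not drawn from any particular product). Each rule pairs a set of antecedent facts with a consequent fact, and the engine keeps firing rules until no new facts can be deduced.

        # Minimal forward-chaining inference engine (illustrative sketch).
        # A rule is (antecedents, consequent): IF all antecedent facts hold
        # in the knowledge base, THEN the consequent fact is added to it.
        def infer(knowledge_base, rules):
            facts = set(knowledge_base)
            changed = True
            while changed:
                changed = False
                for antecedents, consequent in rules:
                    if consequent not in facts and all(a in facts for a in antecedents):
                        facts.add(consequent)  # deduce a new fact
                        changed = True
            return facts

        # Example rules: IF rains AND outside THEN wet; IF wet THEN cold.
        rules = [({"rains", "outside"}, "wet"), ({"wet"}, "cold")]
        print(infer({"rains", "outside"}, rules))
        # -> {'rains', 'outside', 'wet', 'cold'}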

  • RELATED ARTICLES

    Tech Guide: To Harness Generative AI, You Must Learn About “Training” & “Inference”

    Unless you’ve been living under a rock, you must be familiar with the “magic” of generative AI: how chatbots like ChatGPT can compose anything from love letters to sonnets, and how text-to-image models like Stable Diffusion can render art based on text prompts. The truth is, generative AI is not only easy to make sense of, but also a cinch to work with. In our latest Tech Guide, we dissect the “training” and “inference” processes behind generative AI, and we recommend total solutions from GIGABYTE Technology that’ll enable you to harness its full potential.
    5G: Ushering in a New Era of Smart Healthcare Applications with 5G-based MEC

    A new epoch of smart medicine has been made possible by the proliferation of 5G. Healthcare has advanced from the treatment of symptoms to the early detection of disease and the tracking of each individual’s condition. GIGABYTE Technology offers the highly scalable, high-density E-Series Edge Servers for edge computing; by working with Alpha Networks Inc., GIGABYTE has brought the benefits of multi-access edge computing (MEC) to wearable devices and other Internet of Things (IoT) applications that will help realize the vision of smart medicine. The compilation of big data, coupled with the low-latency characteristics of MEC, allows for faster data integration and analysis, helping healthcare providers offer customized, predictive care to their patients.