Latent AI – Adaptive AI for the Intelligent Edge
Breaking News: Booz Allen Invests in Adaptive AI Company Latent AI

What We Do
We take the hard work out of AI processing on the edge. Latent AI's LEIP platform enables adaptive AI at the edge by optimizing for compute, energy, and memory without requiring changes to existing AI/ML infrastructure and frameworks.

Who We Are
We're an early-stage venture spinout of SRI International, well-funded by industry-leading investors with support from Fortune 500 clients. Our seasoned team has many years of experience in machine learning, AI, computer vision, embedded systems, IoT applications, and high-performance computing.

What We Create
The Latent AI Efficient Inference Platform (LEIP) is a modular, fully integrated workflow designed to train, quantize, adapt, and deploy edge AI neural networks. Learn more about our available tools, LEIP Compress and LEIP Compile, below.

Just Released: Unlocking the Power of Edge AI

How Latent AI's Technologies Benefit Your Edge AI Development
- Adaptive: dynamically throttles accuracy for compute efficiency.
- Hardware agnostic: supports any processor platform, edge or server.
- Dynamic AI workload: no math-intensive ops; processing resources are set up at runtime.
- Lower memory use: ultra-compact footprint for deep learning.

Our latest research, presented at the tinyML 2021 Research Symposium: Quantization-Guided Training for Compact TinyML Models.

Introducing the Latent AI Efficient Inference Platform (LEIP)
LEIP is a modular, fully integrated workflow designed to train, quantize, and deploy edge AI neural networks. LEIP Compress and LEIP Compile are available now, with more module capabilities on the way.

Who Should Use LEIP?
LEIP is designed for AI, embedded, and software application developers who want to easily enable, deploy, and manage AI for the edge.

Module Benefits

LEIP Compress
LEIP Compress is a state-of-the-art quantization optimizer, with tooling designed for edge AI use cases, that supports both post-training and training-aware quantization. It:
- compresses neural networks to balance performance against resource usage based on the specification of the target hardware;
- saves time by removing the unnecessary iterations required by traditional tuning and pruning methods;
- saves cost by reducing the specialized personnel dedicated to quantizing and optimizing AI models.
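LEIP Compress itself is proprietary and its interface is not documented here. As a rough open-source stand-in for the post-training side of this workflow, the sketch below applies PyTorch's built-in dynamic quantization to a small placeholder model; the architecture, the choice of layers to quantize, and the int8 dtype are illustrative assumptions, not Latent AI's tooling.

```python
# Minimal sketch, assuming PyTorch as a generic stand-in for a quantization optimizer.
# This is NOT the LEIP Compress API; the model and settings are illustrative only.
import torch
import torch.nn as nn

# A small placeholder network standing in for an edge AI model.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)
model.eval()

# Post-training dynamic quantization: weights of the listed layer types are
# converted to 8-bit integers, shrinking the model and speeding up CPU inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# The quantized model is a drop-in replacement at inference time.
with torch.no_grad():
    output = quantized(torch.randn(1, 128))
print(output.shape)  # torch.Size([1, 10])
```

Training-aware quantization, which LEIP Compress also supports, instead simulates quantization effects during training so the network can adapt its weights before deployment.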
LEIP Compile
LEIP Compile optimizes neural network processing for target hardware processors. It is the first product with an integrated deep neural network training and compiler framework, providing:
- a seamless, end-to-end integrated workflow, from the ML training framework to an executable binary running on the target edge AI hardware;
- full optimization that compresses library files by 10x while compiling deep neural networks in a matter of hours and running at 5x lower latency.
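LEIP Compile's toolchain is likewise proprietary. To illustrate the general idea of turning a trained network into a deployable library for a hardware target, the sketch below uses the open-source Apache TVM compiler stack with a stock torchvision model; the model, target string, and output file name are assumptions for demonstration and do not represent Latent AI's workflow or its performance figures.

```python
# Minimal sketch, assuming Apache TVM as a generic model-compilation stack.
# This is NOT the LEIP Compile toolchain; model, target, and paths are illustrative.
import torch
import torchvision
import tvm
from tvm import relay

# Trace a trained model into a static graph the compiler can consume.
model = torchvision.models.mobilenet_v2().eval()
example_input = torch.randn(1, 3, 224, 224)
traced = torch.jit.trace(model, example_input)

# Import the traced graph into TVM's Relay intermediate representation.
mod, params = relay.frontend.from_pytorch(traced, [("input", (1, 3, 224, 224))])

# Compile for the host CPU; an edge deployment would instead name the device,
# e.g. an ARM target triple for cross-compilation.
target = "llvm"
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target=target, params=params)

# Export a shared library that the TVM runtime can load and execute.
lib.export_library("mobilenet_v2_cpu.so")
```

On an actual edge device, the exported library is loaded by the runtime on the target hardware, which is the step where compiler-level optimizations pay off in binary size and latency.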
Latent AI Example Use Cases
- Time Critical Inference: collision detection for robots (healthcare); active noise cancelling (consumer electronics); pressure and vibration analysis (industrial manufacturing).
- Limited to No Network Access: remote oil & gas platform control (industrial manufacturing); intruder alert system for home surveillance (consumer electronics); baby monitoring system (consumer electronics).
- Security & Privacy: sentiment analysis of shoppers at retail shelves (retail); viewer analysis of TV shows (consumer electronics); factory production quality assurance (industrial manufacturing).
- Resource Constrained Environments: augmented reality wearables (multipurpose); drone surveillance (multipurpose); health tracker wearables (consumer electronics).
- Efficiency/Cost Savings: efficient data collection through on-device analysis (industrial manufacturing); low-to-high resolution video on device with reduced silicon requirements (consumer electronics).

We're hiring! Visit our Careers page to learn more or email us at [email protected].

"SRI International has long been at the forefront of research in the rapidly evolving fields of machine learning, computer vision, and robotics. We are proud to share SRI's deep expertise and cutting-edge research in these areas with Latent AI to accelerate a solution that will bring the visionary promise of computing and AI to real-world applications."
Manish Kothari, Ph.D., SRI International President

"The edge needs AI, and AI needs the edge. Latent AI integrates both with a portfolio of IoT edge compute optimizers and accelerators that bring an order of magnitude improvement to existing infrastructure. This is essential as the majority of new software today is AI and most compute cycles will shift to the edge."
Steve Jurvetson, Founder and Managing Partner, Future Ventures

"The rapid evolution of artificial intelligence has led to a redefining of performance requirements at the edge. Jags Kandasamy and his team at Latent AI have demonstrated significant expertise in their ability to optimize edge performance without changing existing AI infrastructure. We look forward to working with Latent AI as the team continues to execute on its vision."
Anurag Jain, Managing Partner, Perot Jain

"I provided funding to this group when they were at SRI and I was a DARPA Program Manager. I was impressed with their approach to Deep Learning. I am excited to see that they will be commercializing the technology. It is a high quality team that has been doing some of the best work in both low precision and temporal based Deep Learning."
Dr. Dan Hammerstrom, Professor Emeritus, Portland State University, and former DARPA Program Manager

Latest Latent AI News
- Creating the perfect Iron Chef AI recipe
- Solving edge AI challenges in a hardware-conflicted world
- Latent AI gives back at SBHS Hackathon
- Closing the gap between edge AI expertise and implementation
- Latent AI named IoT Emerging Company of the Year for the Enterprise Market
- Latent AI named Exploding Topics top edge AI startup
- Deploy optimized AI models quickly with Latent AI LEIP Recipes
- Join Latent AI at the TinyML Summit, San Francisco
- Latent AI partners with GreenAI.cloud, entering the EMEA market
- Latent AI announces new VP of Engineering as company embarks on R&D expansion plans

Recognition

Get in Touch!
Please contact us at [email protected] or +1 609-831-2326 (US) for inquiries about our products or company.

Resources: News | Privacy

West Coast Office
Latent AI, Inc.
333 Ravenswood Avenue
Menlo Park, CA 94025-3493

East Coast Office
Latent AI, Inc.
30 Vreeland Drive, Bldg 30, Suite 1
Skillman, NJ

[email protected]

© Copyright Latent AI, Inc. 2019. All Rights Reserved.