Generative AI Company, FriendliAI Releases Public Beta of PeriFlow Cloud

News date: 2023-07-21

SEOUL-- July 21, 2023 -- FriendliAI, a leading generative AI engine company, is proud to announce the public beta release of PeriFlow Cloud. This powerful platform empowers users to run PeriFlow, an engine for generative AI serving, within a managed cloud environment.

With its innovative approach specifically tailored to large language models (LLMs), the PeriFlow engine achieves remarkable improvements in throughput while maintaining low latency. This cutting-edge engine is built upon FriendliAI’s groundbreaking batching and scheduling techniques, which are protected by patents in the United States and Korea, including U.S. Patent No. 11,514,370, U.S. Patent No. 11,442,775, Korean Patent No. 10-2498595, and Korean Patent No. 10-2479264.

PeriFlow is fast and versatile, attracting a growing number of companies that build their own LLMs, whether by pretraining from scratch or by fine-tuning open-source models. Supporting a broad range of LLMs, including GPT, GPT-J, GPT-NeoX, MPT, LLaMA, Dolly, OPT, BLOOM, T5, FLAN, UL2, and more, PeriFlow offers diverse decoding options such as greedy, top-k, top-p, beam search, and stochastic beam search. Furthermore, it supports multiple data types, including fp32, fp16, bf16, and int8, letting users tune the balance between precision and speed.
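The decoding options listed above are standard text-generation strategies rather than anything unique to PeriFlow. For readers unfamiliar with the terms, here is a minimal NumPy sketch of greedy, top-k, and top-p (nucleus) decoding over a toy logits vector; it illustrates the general technique only and is not PeriFlow's implementation.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over a 1-D logits vector.
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def top_k_top_p_sample(logits, k=50, p=0.9, rng=None):
    """Sample one token id with top-k then top-p (nucleus) filtering.

    Generic illustration of the decoding options named in the release
    (greedy, top-k, top-p); this is not PeriFlow's implementation.
    """
    if rng is None:
        rng = np.random.default_rng()
    probs = softmax(np.asarray(logits, dtype=np.float64))

    # Top-k: zero out everything except the k most probable tokens.
    if k and k < probs.size:
        cutoff = np.sort(probs)[-k]
        probs = np.where(probs >= cutoff, probs, 0.0)

    # Top-p (nucleus): keep the smallest prefix of the sorted tokens whose
    # cumulative mass reaches a fraction p of the remaining probability mass.
    order = np.argsort(probs)[::-1]
    cum = np.cumsum(probs[order])
    keep = order[: int(np.searchsorted(cum, p * probs.sum())) + 1]

    filtered = np.zeros_like(probs)
    filtered[keep] = probs[keep]
    filtered /= filtered.sum()
    return int(rng.choice(probs.size, p=filtered))

# Greedy decoding is simply the argmax over the same logits.
logits = [2.0, 1.0, 0.5, -1.0]
print("greedy :", int(np.argmax(logits)))
print("sampled:", top_k_top_p_sample(logits, k=3, p=0.9))
```

Beam search and stochastic beam search, also listed above, extend the same idea by keeping several candidate sequences per step instead of committing to a single token.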

FriendliAI also offers PeriFlow as a container solution, named PeriFlow Container, which has gained considerable traction among companies for LLM serving. For instance, Scatter Lab, a prominent social chatbot company in Korea, handles its high user traffic by leveraging PeriFlow Container to run multiple LLMs, including the popular Luda 2.0. As a result, Scatter Lab has achieved a remarkable 50% reduction in its serving infrastructure costs.

The Benefits of PeriFlow Cloud


PeriFlow Cloud simplifies the adoption of PeriFlow for organizations of any scale. With PeriFlow Cloud, users can serve LLMs at exceptional speed and low cost (70-90% GPU savings) without the hassle of setting up and managing cloud resources.

Through PeriFlow Cloud, users can centrally manage every deployed LLM from anywhere. Users can effortlessly upload model checkpoints, deploy models, and instantly send inference requests. Comprehensive monitoring tools let users track events, errors, and performance metrics while interactively testing deployed LLMs in the playground. The platform dynamically handles performance and fault issues and auto-scales based on traffic patterns, freeing users to focus on creating LLMs and driving innovation.
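As a rough sketch of the deploy-and-query workflow described above, the snippet below posts a completion request to an already-deployed LLM endpoint over HTTP. The endpoint URL, header, and JSON field names are hypothetical placeholders for illustration only; they are not taken from PeriFlow Cloud's actual API, so consult the official documentation for the real interface.

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical placeholders -- substitute the values shown for your own
# deployment; these names are illustrative and not taken from the release.
ENDPOINT_URL = "https://example-inference-endpoint/v1/completions"
API_TOKEN = "YOUR_API_TOKEN"

payload = {
    "prompt": "Summarize the benefits of managed LLM serving in one sentence.",
    "max_tokens": 64,
    # Decoding options mirroring those named in the release.
    "top_k": 50,
    "top_p": 0.9,
}

response = requests.post(
    ENDPOINT_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```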

Byung-Gon Chun, Founder & CEO of FriendliAI, emphasizes the significance of efficient LLM serving, stating, “Generative AI is revolutionizing our lives, enabling more creative, intelligent, and productive services. Many organizations are now training their own models, but they have yet to fully realize how costly and painful it is to serve these models at scale for a large user base.”

“We’re due for a significant transformation in the way we serve LLMs to empower organizations to fully harness the potential of their LLMs,” Chun adds. “PeriFlow Cloud is an instant and cost-effective solution. We are incredibly excited to see the innovative services users will develop with their generative AI models, powered by PeriFlow Cloud.”

Get Started with PeriFlow Cloud Today


The public beta version of PeriFlow Cloud is now available. Users can deploy their LLMs on PeriFlow, the fastest generative AI inference serving engine, in a matter of minutes. Visit the official website to get started today.


