Rakuten significantly better than Rocket


Page 503 of 503
Opened: 15.11.14 10:58 by Libuda
Posts: 13,563
Latest post: 20.12.24 14:05 by Libuda
Total readers: 3,179,985
Forum: Börse
Readers today: 763
Rated: 8


 

Libuda (63410 postings, 7299 days): Significant potential for financial performance

 
  
    #12551
17.12.24 22:54
Significant potential for improved financial performance

The company's trading price below its estimated fair value suggests potential for future appreciation, especially as it leverages AI and expands Rakuten Symphony to secure new business opportunities. The combination of these factors indicates a promising yet challenging path forward, with significant potential for improved financial performance and market share expansion.

https://simplywall.st/stocks/jp/retail/tse-4755/...uho-alliance-and-m
 

Libuda (63410 postings, 7299 days): New AI Models Optimized for Japanese

 
  
    #12552
18.12.24 13:19
Rakuten Unveils New AI Models Optimized for Japanese

- New Large Language Model and first Small Language Model deliver greater efficiency and aim to make AI applications accessible for everyone

Tokyo, December 18, 2024 – Rakuten Group, Inc. has unveiled two new AI models: Rakuten AI 2.0, the company’s first Japanese large language model (LLM) based on a Mixture of Experts (MoE) architecture, and Rakuten AI 2.0 mini, the company’s first small language model (SLM). Both models will be released to the open-source community by spring 2025 to empower companies and professionals developing AI applications.

Rakuten AI 2.0 is an 8x7B MoE foundation model based on the Rakuten AI 7B model released in March 2024. This MoE model consists of eight 7-billion-parameter models, each acting as a separate expert. Each individual token is sent to the two most relevant experts, as decided by the router. The experts and the router are continually trained together on vast amounts of high-quality Japanese and English language data.
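
To make the routing step concrete, here is a minimal sketch of a generic top-2 Mixture of Experts layer in PyTorch. It only illustrates the mechanism described above and is not Rakuten's implementation: the class name, layer widths and GELU feed-forward blocks are my own illustrative choices, and only the eight experts and the top-2 selection mirror the press release.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopTwoMoELayer(nn.Module):
    """Toy sparse MoE layer: a router picks the top-2 of 8 experts per token."""

    def __init__(self, d_model: int, d_ff: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)  # one routing score per expert
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        tokens = x.reshape(-1, x.size(-1))            # (n_tokens, d_model)
        scores = self.router(tokens)                  # (n_tokens, n_experts)
        top_w, top_idx = scores.topk(self.top_k, -1)  # the 2 most relevant experts per token
        top_w = F.softmax(top_w, dim=-1)              # normalized mixing weights

        out = torch.zeros_like(tokens)
        for expert_id, expert in enumerate(self.experts):
            for slot in range(self.top_k):
                routed = top_idx[:, slot] == expert_id          # tokens sent to this expert
                if routed.any():
                    out[routed] += top_w[routed, slot, None] * expert(tokens[routed])
        return out.reshape_as(x)

# Toy usage: a batch of 4 tokens with width 32 passes through the sparse layer.
layer = TopTwoMoELayer(d_model=32, d_ff=64)
print(layer(torch.randn(1, 4, 32)).shape)  # torch.Size([1, 4, 32])
```

In a real MoE transformer a layer like this typically replaces the dense feed-forward block in each transformer layer, and an auxiliary load-balancing loss usually keeps the router from collapsing onto a few experts.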

Rakuten AI 2.0 mini is a 1.5-billion-parameter foundation model and the first SLM developed by the company. The model was trained from scratch on extensive Japanese and English language datasets curated and cleaned through an in-house multi-stage data filtering and annotation process, ensuring high performance and accuracy in text generation tasks.
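
As background on what a "multi-stage data filtering" pipeline can look like, here is a deliberately simplified, hypothetical sketch (my own toy stages and thresholds; Rakuten's actual pipeline is not public): documents pass a length check, a crude Japanese/English language heuristic, and duplicate removal before being kept.

```python
import hashlib
import re

def stage_length(doc: str, min_chars: int = 200) -> bool:
    # Stage 1: drop very short documents.
    return len(doc) >= min_chars

def stage_language(doc: str) -> bool:
    # Stage 2: keep documents containing Japanese script or predominantly ASCII English.
    has_japanese = re.search(r"[\u3040-\u30ff\u4e00-\u9fff]", doc) is not None
    ascii_ratio = sum(c.isascii() for c in doc) / max(len(doc), 1)
    return has_japanese or ascii_ratio > 0.95

def stage_dedup(doc: str, seen: set) -> bool:
    # Stage 3: drop exact duplicates after whitespace/case normalization.
    digest = hashlib.sha1(" ".join(doc.split()).lower().encode("utf-8")).hexdigest()
    if digest in seen:
        return False
    seen.add(digest)
    return True

def filter_corpus(docs):
    # Chain the stages; only documents passing all of them survive.
    seen: set = set()
    for doc in docs:
        if stage_length(doc) and stage_language(doc) and stage_dedup(doc, seen):
            yield doc

docs = ["これは日本語のサンプル文書です。" * 20, "too short", "Plain English sample text. " * 20]
print(len(list(filter_corpus(docs))))  # 2 of the 3 toy documents pass all stages
```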

"At Rakuten, we see AI as a catalyst to augment human creativity and drive greater efficiency. Earlier this year, we launched a 7B Japanese LLM to accelerate AI solutions for local research and development," commented Ting Cai, Chief AI & Data Officer of Rakuten Group. "Our new cutting-edge Japanese LLM and pioneering SLM set new standards in efficiency, thanks to high-quality Japanese language data and innovative algorithms and engineering. These breakthroughs mark a significant step in our mission to empower Japan’s businesses and professionals to create AI applications that truly benefit users."

High efficiency with advanced architecture

Rakuten AI 2.0 employs a sophisticated Mixture of Experts architecture, which dynamically selects the most relevant experts for the given input tokens, optimizing computational efficiency and performance. The model offers performance comparable to dense models eight times larger, while consuming approximately one quarter of the computation of such dense models during inference.
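
A rough way to see where the factor of four comes from (my own back-of-the-envelope simplification, counting only the expert feed-forward parameters and ignoring shared components such as attention layers and embeddings): with N = 8 experts of about 7 billion parameters each and top-2 routing, the per-token compute ratio is

\[
\frac{\text{MoE compute per token}}{\text{dense compute per token}}
\;\approx\; \frac{k \cdot P_{\text{expert}}}{N \cdot P_{\text{expert}}}
\;=\; \frac{2 \times 7\,\mathrm{B}}{8 \times 7\,\mathrm{B}}
\;=\; \frac{1}{4},
\]

so each token activates only about a quarter of the parameter pool that a dense 8x7B model would have to run through, which is consistent with the stated efficiency gain.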

Increased performance

Rakuten has conducted model evaluations with the LM Evaluation Harness to measure Japanese and English capabilities. The harness evaluates language models on a wide range of Natural Language Processing and Understanding tasks that reflect the characteristics of the target language. Rakuten AI 2.0’s average Japanese score improved to 72.29 from 62.93 across eight tasks compared to Rakuten AI 7B, the open LLM Rakuten released in March 2024.
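
For readers who want to run this kind of benchmark themselves, here is a hedged sketch assuming the v0.4-style Python API of the EleutherAI LM Evaluation Harness (lm_eval.simple_evaluate); the model identifier and the two Japanese task names are illustrative placeholders, not the exact eight-task suite or checkpoint Rakuten reports on.

```python
# Sketch: scoring a Hugging Face checkpoint with the LM Evaluation Harness.
# The pretrained path and task names below are placeholders for illustration.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",                                    # transformers backend
    model_args="pretrained=Rakuten/RakutenAI-7B",  # swap in the checkpoint to evaluate
    tasks=["jcommonsenseqa", "jsquad"],            # example Japanese NLP/NLU tasks
    num_fewshot=0,
    batch_size=8,
)

# Print one metric dict per task; averaging per-task scores gives a harness-style average.
for task, metrics in results["results"].items():
    print(task, metrics)
```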

https://global.rakuten.com/corp/news/press/2024/1218_01.html
 

Libuda (63410 postings, 7299 days): AI utilization in all aspects of our business

 
  
    #12553
18.12.24 18:26

Through further pre-training of the existing 7-billion-parameter foundation model "Rakuten AI 7B", and by expanding the in-house designed multi-node GPU cluster, it became possible to accelerate pre-training on large and complex data. Rakuten aims to contribute to the open-source community by providing the latest LLM and SLM as open models, further advancing Japanese-language LLMs. By continuing to develop the latest LLM models in-house, Rakuten will accumulate knowledge and know-how, working towards the expansion of the "Rakuten Ecosystem" (economic zone).

Rakuten has adopted the theme of “AI-nization” to signify AI utilization in all aspects of its business, and is committed to leveraging AI across the business to drive further growth. Through the use of abundant data and cutting-edge AI technology, the company aims to create new value for people around the world.

https://www.moomoo.com/news/post/47271779/..._ticket=1734542258442720
 

Libuda (63410 postings, 7299 days): Deletion

 
  
    #12554
19.12.24 01:48

Moderation
Time: 21.12.24 14:51
Actions: Post deleted, user suspended for 6 hours
Comment: Rule violation

 

 

Libuda (63410 postings, 7299 days): Excellent development of Rakuten Bank

 
  
    #12555
19.12.24 10:24

Libuda (63410 postings, 7299 days): 5 Steps to Dramatically Save on Your Mobile Phone

 
  
    #12556
19.12.24 14:55
5 Steps to Dramatically Save on Your Mobile Phone Bill

Post published: December 13, 2024

https://mobile-contract-guide.com/en/...ve-on-mobile-bills-5-steps-en
 

tradeconto (30 postings, 267 days): This stock is a tragedy

 
  
    #12557
19.12.24 17:59
One might have thought that, after good news on the mobile business, the stock would finally climb steadily, but Rakuten is simply a constant losing proposition for buy-and-hold investors.
Good that I am only in with a small position.

Libuda (63410 postings, 7299 days): Deletion

 
  
    #12558
19.12.24 18:03

Moderation
Time: 20.12.24 16:10
Actions: Post deleted, user suspended for 1 day
Comment: Advertising

 

 

Libuda (63410 postings, 7299 days): Re #12557: Solid beats hocus-pocus

 
  
    #12559
19.12.24 18:14

Libuda (63410 postings, 7299 days): Deletion

 
  
    #12560
19.12.24 21:56

Moderation
Time: 20.12.24 16:10
Action: Post deleted
Comment: No relation to the thread topic

 

 

Libuda (63410 postings, 7299 days): Deletion

 
  
    #12561
20.12.24 02:52

Moderation
Time: 21.12.24 14:50
Actions: Post deleted, user suspended for 1 day
Comment: Advertising

 

 

Libuda (63410 postings, 7299 days): Rakuten AI is unlocking unlimited potential

 
  
    #12562
20.12.24 06:51
Inside Rakuten AI Episode 1: Ting Cai on AI Innovation and Responsibility

In this exciting first episode, we chat with Ting Cai, Rakuten Group Chief AI & Data Officer, who shares how Rakuten AI is unlocking unlimited potential for smarter, more impactful solutions for businesses and users alike. Cai also shares why responsible AI is so important and how Rakuten ensures our innovation is guided by ethics, transparency and trust. Stay tuned for upcoming episodes, where we’ll sit down with other Rakuten AI leaders to explore how they’re shaping the future of AI innovation!

https://www.youtube.com/watch?v=PMRTvPchwqc
 

Libuda (63410 postings, 7299 days): Something similar will happen at Rakuten Mobile

 
  
    #12563
20.12.24 10:02

Libuda (63410 postings, 7299 days): Repeat of the source of #12652

 
  
    #12564
20.12.24 14:05


1 user was excluded from the discussion by the thread creator: tradeconto