link1s.site

Stanford AI project team apologizes for plagiarizing Chinese model

An artificial intelligence (AI) team at Stanford University apologized for plagiarizing a large language model (LLM) from a Chinese AI company. The matter became a trending topic on Chinese social media platforms, where it sparked concern among netizens on Tuesday.

"We apologize to the authors of MiniCPM [the AI model developed by a Chinese company] for any inconvenience that we caused for not doing the full diligence to verify and peer review the novelty of this work," the developers of the multimodal AI model Llama3-V wrote in a post on the social platform X.

The apology came after the team from Stanford University announced Llama3-V on May 29, claiming it had performance comparable to GPT-4V and other models and could be trained for less than $500.

According to media reports, the announcement published by one of the team members quickly received more than 300,000 views.

However, some users on X found and listed evidence that the Llama3-V project code was a reformatted version of, and highly similar to, MiniCPM-Llama3-V 2.5, an LLM developed by Chinese technology company ModelBest and Tsinghua University.

Two team members, Aksh Garg and Siddharth Sharma, reposted a netizen's query and apologized on Monday, while claiming that their role was to promote the model on Medium and X (formerly Twitter), and that they had been unable to contact the member who wrote the code for the project.

They said they had looked at recent papers to validate the novelty of the work but had not been informed of, or aware of, any work by the Open Lab for Big Model Base, which was founded by the Natural Language Processing Lab at Tsinghua University and ModelBest. They noted that they had taken down all references to Llama3-V out of respect for the original work.

In response, Liu Zhiyuan, chief scientist at ModelBest, spoke out on the Chinese social media platform Zhihu, saying that the Llama3-V team failed to comply with open-source protocols for respecting and honoring the achievements of previous researchers, thus seriously undermining the cornerstone of open-source sharing.

According to a screenshot leaked online, Li Dahai, CEO of ModelBest, also made a post on his WeChat Moments, saying that the two models were verified to show a high degree of similarity in the answers they give, down to producing the same errors, and that some of the relevant data had not yet been released to the public.

He said the team hopes that their work will receive more attention and recognition, but not in this way. He also called for an open, cooperative and trusting community environment.

Christopher Manning, director of the Stanford Artificial Intelligence Laboratory, also responded to Garg's explanation on Sunday, commenting on X: "How not to own your mistakes!"

As the incident became a trending topic on Sina Weibo, Chinese netizens commented that academic research should be held to factual standards, but that the incident also shows technology development in China is progressing.

Global Times

Google extends Linux kernel support to 4 years
According to AndroidAuthority, the Linux kernel used by Android devices is mostly derived from Google's Android Common Kernel (ACK) branches, which are created from the Android mainline kernel branch when new LTS versions are released upstream. For example, when kernel version 6.6 is announced as the latest LTS release, an ACK branch named android15-6.6 appears shortly after, with the "android15" in the name referring to the Android version the kernel targets (in this case, Android 15).

Google maintains its own set of LTS kernel branches for three main reasons. First, Google can backport or cherry-pick upstream features that have not yet been released into the ACK branch, so as to meet the specific needs of Android. Second, Google can include features that are still being developed upstream in the ACK branch ahead of time, making them available to Android devices as early as possible. Finally, Google can add vendor or original equipment manufacturer (OEM) features for other Android partners to use.

Once a branch is created, Google continues to update it, not only with bug fixes for Android-specific code but also by merging in LTS updates from the upstream kernel branch. For example, the Linux kernel vulnerability disclosed in the July 2024 Android security bulletin will be fixed through these updates. However, it is not easy to distinguish a security fix from ordinary bug fixes, as a patch that fixes a bug may also quietly plug a security vulnerability that the submitter did not know about or chose not to disclose. Google does its best to identify such cases, but it inevitably misses some, with the result that fixes can land in the upstream Linux kernel months before they reach Android devices. As a result, Google has been urging Android vendors to regularly update to the latest LTS kernel to avoid being caught off guard by unexpectedly disclosed security vulnerabilities.
Clearly, the LTS version of the Linux kernel is critical to the security of Android devices, helping Google and vendors deal with known and unknown security vulnerabilities. The longer the support period, the more timely security updates Google and vendors can provide to devices.
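As a rough sketch of the branch-naming convention described above, the two components of an ACK branch name can be split apart programmatically. The branch name below is just the article's android15-6.6 example, not output from a real device:

```python
def parse_ack_branch(branch: str) -> tuple[str, str]:
    """Split an ACK branch name of the form android<release>-<LTS kernel version>
    into its Android release tag and upstream kernel version."""
    android_release, _, kernel_version = branch.partition("-")
    return android_release, kernel_version

# The example branch name from the article above.
release, kernel = parse_ack_branch("android15-6.6")
print(release)  # android15
print(kernel)   # 6.6
```

Note that `partition` splits on the first "-" only, so a branch name whose kernel version itself contains a dash would keep everything after the first separator in the kernel part.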
Musk is the billionaire who lost the most money in the first half of 2024: $5 billion a month
At the beginning of this year, Elon Musk had a fortune of $251 billion and could almost single-handedly solve world hunger. However, Tesla's stagnant sales, the endless struggle over Twitter, and the volatility of Tesla's stock price meant he lost a lot of money this year. According to Forbes, Musk is the billionaire with the biggest losses so far this year, with his wealth shrinking at a rate of about $5 billion a month. According to the website, his wealth shrank by more than 10% from the end of 2023 to June 28, 2024.

As the website explains: between December 31, 2023, and June 28, the last day of regular stock market trading for the first half of the year, Musk's net worth fell from $251.3 billion to $221.4 billion, a bigger drop than for any other billionaire tracked by Forbes, though Musk remains the richest person on the planet.

The main reason for the dip in Musk's pocketbook is that a Delaware judge in January canceled Musk's then-record Tesla compensation package worth $51 billion, which led Forbes to cut the value of the equity award by 50 percent because of uncertainty about whether Musk would receive those stock options. Excluding that award, Musk's wealth has remained volatile over the past six months, with the value of his 13 percent stake in Tesla shrinking by about $20 billion as falling profits and car deliveries sent the stock down 20 percent. That was partly offset by the growth of Musk's stake in his generative artificial intelligence startup xAI to $14.4 billion. (Musk also has a roughly $75 billion stake in private aerospace company SpaceX, a $7 billion stake in social media company X, and smaller stakes in other companies, such as brain-computer interface startup Neuralink.)
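The "$5 billion a month" figure follows directly from the numbers Forbes quotes above; a quick arithmetic check (treating the window as roughly six months):

```python
# Net worth figures quoted from Forbes, in billions of US dollars.
start = 251.3   # December 31, 2023
end = 221.4     # June 28, 2024

drop = start - end          # total decline over the first half of 2024
per_month = drop / 6        # roughly six months in the window

print(f"Total drop: ${drop:.1f}B (~{drop / start:.1%} of start)")
print(f"About ${per_month:.1f}B per month")
```

The result, a $29.9 billion drop (about 11.9% of the starting figure, hence "more than 10%") and roughly $5.0 billion per month, matches the figures cited in the article.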
Porsche AG reports sharp fall in China deliveries
July 9 (Reuters) - German sportscar maker Porsche (P911_p.DE) said on Tuesday that global vehicle deliveries were down 7% in the first half of the year compared to the same period in 2023, primarily driven by a 33% year-on-year drop in China. Porsche, majority-owned by Volkswagen (VOWG_p.DE), is highly exposed to the EU-China tariff tensions, with deliveries to China accounting for nearly 20% of global deliveries. An HSBC analyst pointed to weakness in the European car market, saying that "the market is, understandably, worried about China pricing weakness and the prospect of needing to pay dealer compensation." Overall, Porsche delivered 155,945 cars worldwide during the first six months of the year. In North America, deliveries were down 6% year-on-year. Meanwhile, in Porsche's home market of Germany, deliveries increased by 22% to 20,811 vehicles.
Biden's aging accelerated over the past year
In a recent interview with ABC, US President Joe Biden said he had no intention of dropping out of the race, blaming his poor debate performance on a cold. He also insisted he was "still in good shape" and would remain in the race, saying only "Almighty God" could pull him out.

An insider who has worked with Mr. Biden for a long time said that signs of aging had become apparent over the past year, but that Mr. Biden's team had failed to address them. Biden's televised debate performance heightened concerns about an already slow-moving issue. Mr. Biden's advisers have long dodged questions about his age, but now they acknowledge that his aging is an undeniable fact. The debate forced the president to acknowledge more openly the limitations of his age, which he had previously largely dismissed.

But they have taken only superficial measures and have not fundamentally solved the problem. They replaced the long staircase that Mr. Biden used to board Air Force One with a shorter one; assistants often accompanied him in public to make his stiff gait less noticeable; and although he has a busy schedule, aides arranged buffer time, such as long weekends at his homes in Wilmington and Rehoboth Beach, Delaware, or extended stays at Camp David, the Maryland retreat, so he could rest after a "grueling" stretch of travel.

Under the direction of one of his top advisers, Anita Dunn, Mr. Biden's public interactions, especially with reporters, were severely limited. Even at major events with Democrats or other supporters, the White House sometimes limited the amount of time Biden could spend with the audience, two people familiar with the matter said, describing it as a protective response designed to shield their longtime boss.