
ChatGPT: Explained to Kids (How ChatGPT Works)

"Chat" means just that: a conversation. "GPT" is an acronym for Generative Pre-trained Transformer.

Generative means it creates or produces something new; Pre-trained means the model has already learned from a huge amount of text; and Transformer is the name of the neural-network architecture the model is built on.

Don't worry about the T for now; just focus on the G and the P.

We mainly use its generative ability to produce all kinds of content. But why can it produce all kinds of content? The reason lies in the P: pre-training.

Only after learning from a large amount of content can it go on to produce new content.
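Here is a toy sketch of that learn-then-generate loop. We "pre-train" by counting which word follows which in some text, then "generate" by walking those counts. A real model uses a transformer trained on billions of words; this little Markov-chain toy, with names invented for illustration, only shows the basic idea.

```python
import random
from collections import defaultdict

def pretrain(text):
    """'Pre-training': learn which word tends to follow which."""
    model = defaultdict(list)
    words = text.split()
    for current, nxt in zip(words, words[1:]):
        model[current].append(nxt)
    return model

def generate(model, start, length=8):
    """'Generation': produce new text by following the learned pairs."""
    word = start
    output = [word]
    for _ in range(length):
        followers = model.get(word)
        if not followers:
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

corpus = "the cat sat on the mat and the cat ran"
model = pretrain(corpus)
print(generate(model, "the"))
```

The more text the toy sees, the more word pairs it learns and the more varied its output becomes, which is the same reason real models are trained on so much material.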

And this kind of learning naturally has limits. For example, even if you have been learning since childhood, can you guarantee that your answer to any question is completely correct?

Almost impossible, and for three reasons. First, the limits of knowledge: ChatGPT is no exception, because no one can master all knowledge. Second, the accuracy of knowledge: how can we ensure that everything it learned is correct and error-free? Third, the complexity of knowledge: the same concept shows up differently in different contexts, which is hard even for humans to grasp perfectly, let alone an AI.

So when we use ChatGPT, we also need to check the accuracy of what it outputs. Most of the time it is probably fine, but for anything critical you should review the answer manually before relying on it.
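That review step can be as simple as routing answers about critical topics to a human before they are used. A minimal sketch of the idea; the keyword list and function name here are invented for illustration, and a real system would decide far more carefully:

```python
# Hypothetical list of topics where a wrong answer is costly.
CRITICAL_TOPICS = {"medical", "legal", "financial"}

def needs_human_review(question: str) -> bool:
    """Flag questions on critical topics so a person checks the answer first."""
    words = set(question.lower().split())
    return bool(words & CRITICAL_TOPICS)

print(needs_human_review("What is the capital of France?"))   # casual question
print(needs_human_review("Is this medical advice correct?"))  # critical: review it
```

Everything else can flow through automatically; only the flagged answers wait for a person.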

And ChatGPT has since been upgraded twice: once to GPT-4, which answers more accurately, and more recently to GPT-4 Turbo.

The current ChatGPT is what is called a multimodal large model. Unlike the first generation, it can accept not only text but also other kinds of input, such as images, documents, and videos. Its output is more diverse too: besides text, it can produce images, files, and so on.
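One way to picture "multimodal" input is a message made of typed parts, so text and images can travel together in a single request. This is a rough sketch of that data structure, not any real API; all the class and function names are made up:

```python
from dataclasses import dataclass
from typing import List, Union

@dataclass
class TextPart:
    text: str

@dataclass
class ImagePart:
    url: str

# A multimodal message is just a list of typed parts.
Part = Union[TextPart, ImagePart]

def describe(parts: List[Part]) -> str:
    """Summarize what kinds of input a multimodal message contains."""
    kinds = ["text" if isinstance(p, TextPart) else "image" for p in parts]
    return " + ".join(kinds)

message = [
    TextPart("What is in this picture?"),
    ImagePart("https://example.com/cat.jpg"),
]
print(describe(message))  # text + image
```

A text-only model would accept just the first part; a multimodal model handles the whole list.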
