Be that!

contact@bethat.ne.com

 

Menu
  • Home
  • Travel
  • Culture
  • Lifestyle
  • Sport
  • Contact Us
  • Politics

FBI Seeks to Interview 6 Democrats Over 'Illegal Orders' Video

admin - Latest News - November 26, 2025



Officials say the FBI is seeking to interview six Democratic lawmakers who appeared in a video urging members of the military not to comply with illegal orders. The move comes just days after President Donald Trump accused them of “seditious behavior…punishable by death,” though he later said he wasn’t threatening death. NBC’s Kelly O’Donnell reports for TODAY.



Source link

Related Post
November 15, 2025
Disney and YouTube strike deal to bring TV channels back to streaming platform
October 19, 2025
Oct. 19, 2025, 6:23 AM EDT
By Nick Duffy and Matt Bradley

Israel accused Hamas on Sunday of violating the ceasefire by carrying out attacks on its forces in Gaza, while Hamas accused Israel of working to “fabricate flimsy pretexts” for its own actions.

Israeli and Palestinian media reported that the IDF carried out airstrikes in southern Gaza early Sunday, in what would be its first such attacks since the start of the truce that halted its assault on the besieged Palestinian enclave. Two Palestinian eyewitnesses told AFP that fighting erupted in part of the southern city of Rafah still under Israeli control, followed by two airstrikes.

NBC News has not verified the reports, and the Israeli military did not confirm the strikes.

An Israeli military official subsequently accused Hamas of a “bold violation of the ceasefire” with incidents including a rocket-propelled grenade attack and a sniper attack against Israeli forces. “Hamas carried out multiple attacks against Israeli forces beyond the yellow line,” the official said, referring to the area where the Israeli military is now positioned inside Gaza under the first phase of the ceasefire.

Izzat Al-Rishq, a senior member of Hamas’ political wing, said the group “affirms its commitment to the ceasefire agreement,” accusing Israel of violating the agreement and working to “fabricate flimsy pretexts” to evade its responsibilities.

The ceasefire between Israel and Hamas came into effect on October 10, when the group agreed to release all Israeli hostages held in Gaza in exchange for Palestinian prisoners and detainees under the first phase of a deal brokered by the United States.

Both sides have accused the other of violating the terms of the deal. Israel says Hamas is delaying the release of the bodies of hostages held inside Gaza, while Hamas says it will take time to search for and recover remains.

Itamar Ben-Gvir, Israel’s far-right national security minister who opposed the ceasefire, called Sunday for the IDF to “resume the fighting in the Gaza Strip at full strength.”

The ceasefire also includes the ramping up of aid into Gaza, where the world’s leading authority on hunger has declared a famine in some areas. On Saturday, Prime Minister Benjamin Netanyahu indicated that the Rafah Crossing between Gaza and Egypt would remain closed “until further notice,” citing the hostage dispute.

There have been flashes of violence within Gaza during the ceasefire, marked by at least one public execution and clashes between Hamas and rival factions as the militant group tried to reassert control of the war-torn territory.

On Saturday, the U.S. Department of State said in a post on social media that there had been “credible reports indicating an imminent ceasefire violation by Hamas against the people of Gaza.” Hamas rejected the suggestion.

Nick Duffy is a weekend and world editor for NBC News. Matt Bradley is an international correspondent for NBC News based in Israel. Reuters contributed.
November 25, 2025
Nov. 25, 2025, 6:39 PM EST
By Angela Yang

Warning: This article includes descriptions of self-harm.

After a family sued OpenAI saying their teenager used ChatGPT as his “suicide coach,” the company responded on Tuesday saying it is not liable for his death, arguing that the boy misused the chatbot.

The legal response, filed in California Superior Court in San Francisco, is OpenAI’s first answer to a lawsuit that sparked widespread concern over the potential mental health harms chatbots can pose. In August, the parents of 16-year-old Adam Raine sued OpenAI and its CEO Sam Altman, accusing the company behind ChatGPT of wrongful death, design defects and failure to warn of risks associated with the chatbot.

Chat logs in the lawsuit showed that GPT-4o — a version of ChatGPT known for being especially affirming and sycophantic — actively discouraged him from seeking mental health help, offered to help him write a suicide note and even advised him on his noose setup.

“To the extent that any ‘cause’ can be attributed to this tragic event,” OpenAI argued in its court filing, “Plaintiffs’ alleged injuries and harm were caused or contributed to, directly and proximately, in whole or in part, by Adam Raine’s misuse, unauthorized use, unintended use, unforeseeable use, and/or improper use of ChatGPT.”

The company cited several rules within its terms of use that Raine appeared to have violated: Users under 18 years old are prohibited from using ChatGPT without consent from a parent or guardian. Users are also forbidden from using ChatGPT for “suicide” or “self-harm,” and from bypassing any of ChatGPT’s protective measures or safety mitigations.

When Raine shared his suicidal ideations with ChatGPT, the bot did issue multiple messages containing the suicide hotline number, according to his family’s lawsuit. But his parents said their son would easily bypass the warnings by supplying seemingly harmless reasons for his queries, including by pretending he was just “building a character.”

OpenAI’s new filing in the case also highlighted the “Limitation of liability” provision in its terms of use, which has users acknowledge that their use of ChatGPT is “at your sole risk and you will not rely on output as a sole source of truth or factual information.”

Jay Edelson, the Raine family’s lead counsel, wrote in an email statement that OpenAI’s response is “disturbing.”

“They abjectly ignore all of the damning facts we have put forward: how GPT-4o was rushed to market without full testing. That OpenAI twice changed its Model Spec to require ChatGPT to engage in self-harm discussions. That ChatGPT counseled Adam away from telling his parents about his suicidal ideation and actively helped him plan a ‘beautiful suicide.’ And OpenAI and Sam Altman have no explanation for the last hours of Adam’s life, when ChatGPT gave him a pep talk and then offered to write a suicide note,” Edelson wrote.

(The Raine family’s lawsuit claimed that OpenAI’s “Model Spec,” the technical rulebook governing ChatGPT’s behavior, had commanded GPT-4o to refuse self-harm requests and provide crisis resources, but also required the bot to “assume best intentions” and refrain from asking users to clarify their intent.)

Edelson added that OpenAI instead “tries to find fault in everyone else, including, amazingly, saying that Adam himself violated its terms and conditions by engaging with ChatGPT in the very way it was programmed to act.”

OpenAI’s court filing argued that the harms in this case were at least partly caused by Raine’s “failure to heed warnings, obtain help, or otherwise exercise reasonable care,” as well as the “failure of others to respond to his obvious signs of distress.” It also said that ChatGPT provided responses directing the teenager to seek help more than 100 times before his death on April 11, but that he attempted to circumvent those guardrails.

“A full reading of his chat history shows that his death, while devastating, was not caused by ChatGPT,” the filing stated. “Adam stated that for several years before he ever used ChatGPT, he exhibited multiple significant risk factors for self-harm, including, among others, recurring suicidal thoughts and ideations.”

Earlier this month, seven additional lawsuits were filed against OpenAI and Altman, similarly alleging negligence and wrongful death, as well as a variety of product liability and consumer protection claims. The suits accuse OpenAI of releasing GPT-4o, the same model Raine was using, without adequate attention to safety. OpenAI has not directly responded to the additional cases.

In a blog post Tuesday, OpenAI said it aims to handle such litigation with “care, transparency, and respect.” It added, however, that its response to Raine’s lawsuit included “difficult facts about Adam’s mental health and life circumstances.”

“The original complaint included selective portions of his chats that require more context, which we have provided in our response,” the post stated. “We have limited the amount of sensitive evidence that we’ve publicly cited in this filing, and submitted the chat transcripts themselves to the court under seal.”

The post further highlighted OpenAI’s continued efforts to add safeguards in the months following Raine’s death, including recently introduced parental control tools and an expert council to advise the company on guardrails and model behaviors. The company’s court filing also defended its rollout of GPT-4o, stating that the model passed thorough mental health testing before release.

OpenAI additionally argued that the Raine family’s claims are barred by Section 230 of the Communications Decency Act, a statute that has largely shielded tech platforms from suits seeking to hold them responsible for content found on their platforms. But Section 230’s application to AI platforms remains uncertain, and attorneys have recently made inroads with creative legal tactics in consumer cases targeting tech companies.

If you or someone you know is in crisis, call or text 988 to reach the Suicide and Crisis Lifeline or chat live at 988lifeline.org. You can also visit SpeakingOfSuicide.com/resources for additional support.

Angela Yang is a culture and trends reporter for NBC News.
November 14, 2025
Police chase runaway pig on I-40 in New Mexico
© Copyright 2025 Be That! All Rights Reserved.