
Be that!


Nov. 18, 2025, 5:00 AM EST
By Keir Simmons

DUBAI, United Arab Emirates — It will work like any other ride-hailing app, except that instead of a car, a battery-powered aircraft will swoop in and fly you.

The service is set to launch in Dubai next year. The American company Joby Aviation, Inc. has been developing the technology at Edwards Air Force Base in California as well as in the United Arab Emirates, where earlier this month it became the first electric air taxi company to complete a flight in the Middle Eastern country.

“It’s an absolutely awesome aircraft to fly,” test pilot Peter Wilson told NBC News on Sunday. “The flight is smooth, the handling qualities are exceptional.”

Wilson, who has previously test-flown F-35 fighter jets, said the simple controls on the air taxi are “super safe” as they ensure the pilot has a “low workload while still being able to do all the things they want to do.”

Nov. 18, 2025, 5:00 AM EST
By Jared Perlo

Judge Victoria Kolakowski sensed something was wrong with Exhibit 6C.

Submitted by the plaintiffs in a California housing dispute, the video showed a witness whose voice was disjointed and monotone, her face fuzzy and lacking emotion. Every few seconds, the witness would twitch and repeat her expressions.

Kolakowski, who serves on California’s Alameda County Superior Court, soon realized why: The video had been produced using generative artificial intelligence. Though the video claimed to feature a real witness — who had appeared in another, authentic piece of evidence — Exhibit 6C was an AI “deepfake,” Kolakowski said.

The case, Mendones v. Cushman & Wakefield, Inc., appears to be one of the first instances in which a suspected deepfake was submitted as purportedly authentic evidence in court and detected — a sign, judges and legal experts said, of a much larger threat. Citing the plaintiffs’ use of AI-generated material masquerading as real evidence, Kolakowski dismissed the case on Sept. 9. The plaintiffs sought reconsideration of her decision, arguing the judge suspected but failed to prove that the evidence was AI-generated. Kolakowski denied their request for reconsideration on Nov. 6. The plaintiffs did not respond to a request for comment.

With the rise of powerful AI tools, AI-generated content is increasingly finding its way into courts, and some judges are worried that hyperrealistic fake evidence will soon flood their courtrooms and threaten their fact-finding mission. NBC News spoke to five judges and 10 legal experts who warned that the rapid advances in generative AI — now capable of producing convincing fake videos, images, documents and audio — could erode the foundation of trust upon which courtrooms stand.
Some judges are trying to raise awareness and calling for action around the issue, but the process is just beginning. “The judiciary in general is aware that big changes are happening and want to understand AI, but I don’t think anybody has figured out the full implications,” Kolakowski told NBC News. “We’re still dealing with a technology in its infancy.”

Prior to the Mendones case, courts had repeatedly dealt with a phenomenon billed as the “Liar’s Dividend”: plaintiffs and defendants invoking the possibility of generative AI involvement to cast doubt on actual, authentic evidence. But in the Mendones case, the court found the plaintiffs attempted the opposite: to have AI-generated video admitted as genuine evidence.

Judge Stoney Hiljus, who serves in Minnesota’s 10th Judicial District and chairs the Minnesota Judicial Branch’s AI Response Committee, said the case brings to the fore a growing concern among judges. “I think there are a lot of judges in fear that they’re going to make a decision based on something that’s not real, something AI-generated, and it’s going to have real impacts on someone’s life,” he said.

Many judges across the country agree, even those who advocate for the use of AI in court. Judge Scott Schlegel serves on the Fifth Circuit Court of Appeal in Louisiana and is a leading advocate for judicial adoption of AI technology, but he also worries about the risks generative AI poses to the pursuit of truth. “My wife and I have been together for over 30 years, and she has my voice everywhere,” Schlegel said. “She could easily clone my voice on free or inexpensive software to create a threatening message that sounds like it’s from me and walk into any courthouse around the country with that recording.”

“The judge will sign that restraining order. They will sign every single time,” Schlegel said, referring to the hypothetical recording.
“So you lose your cat, dog, guns, house, you lose everything.”

Judge Erica Yew, a member of California’s Santa Clara County Superior Court since 2001, is passionate about AI’s use in the court system and its potential to increase access to justice. Yet she also acknowledged that forged audio could easily lead to a protective order, and she advocated for more centralized tracking of such incidents. “I am not aware of any repository where courts can report or memorialize their encounters with deep-faked evidence,” Yew told NBC News. “I think AI-generated fake or modified evidence is happening much more frequently than is reported publicly.”

Yew said she is concerned that deepfakes could corrupt other, long-trusted methods of obtaining evidence in court. With AI, “someone could easily generate a false record of title and go to the county clerk’s office,” for example, to establish ownership of a car. But the county clerk likely will not have the expertise or time to check the ownership document for authenticity, Yew said, and will instead just enter the document into the official record.

“Now a litigant can go get a copy of the document and bring it to court, and a judge will likely admit it. So now do I, as a judge, have to question a source of evidence that has traditionally been reliable?” Yew wondered.

Though fraudulent evidence has long been an issue for the courts, Yew said AI could cause an unprecedented expansion of realistic, falsified evidence. “We’re in a whole new frontier,” she said.

Santa Clara County, Calif., Superior Court Judge Erica Yew. (Courtesy of Erica Yew)

Schlegel and Yew are among a small group of judges leading efforts to address the emerging threat of deepfakes in court. They are joined by a consortium of the National Center for State Courts and the Thomson Reuters Institute, which has created resources for judges to address the growing deepfake quandary.
The consortium labels deepfakes as “unacknowledged AI evidence” to distinguish these creations from “acknowledged AI evidence,” like AI-generated accident reconstruction videos, which are recognized by all parties as AI-generated.

Earlier this year, the consortium published a cheat sheet to help judges deal with deepfakes. The document advises judges to ask those providing potentially AI-generated evidence to explain its origin, reveal who had access to the evidence and share whether the evidence had been altered in any way, and to look for corroborating evidence. In April 2024, a Washington state judge denied a defendant’s efforts to use an AI tool to clarify a video that had been submitted.

Beyond this cadre of advocates, judges around the country are starting to take note of AI’s impact on their work, according to Hiljus, the Minnesota judge.

“Judges are starting to consider, is this evidence authentic? Has it been modified? Is it just plain old fake? We’ve learned over the last several months, especially with OpenAI’s Sora coming out, that it’s not very difficult to make a really realistic video of someone doing something they never did,” Hiljus said. “I hear from judges who are really concerned about it and who think that they might be seeing AI-generated evidence but don’t know quite how to approach the issue.” Hiljus is currently surveying state judges in Minnesota to better understand how generative AI is showing up in their courtrooms.

To address the rise of deepfakes, several judges and legal experts are advocating for changes to judicial rules and guidelines on how attorneys verify their evidence. By law and in concert with the Supreme Court, the U.S. Congress establishes the rules for how evidence is used in lower courts.

One proposal, crafted by Maura R. Grossman, a research professor of computer science at the University of Waterloo and a practicing lawyer, and Paul Grimm, a professor at Duke Law School and former federal district judge, would require parties alleging that the opposition used deepfakes to thoroughly substantiate their arguments. Another proposal would transfer the duty of deepfake identification from impressionable juries to judges.

The proposals were considered by the U.S. Judicial Conference’s Advisory Committee on Evidence Rules when it conferred in May, but they were not approved. Members argued “existing standards of authenticity are up to the task of regulating AI evidence.” The U.S. Judicial Conference is a voting body of 26 federal judges, overseen by the chief justice of the Supreme Court. After a committee recommends a change to judicial rules, the conference votes on the proposal, which is then reviewed by the Supreme Court and voted upon by Congress.

Despite opting not to move the rule change forward for now, the committee was eager to keep a deepfake evidence rule “in the bullpen in case the Committee decides to move forward with an AI amendment in the future,” according to committee notes. Grimm was pessimistic about this decision given how quickly the AI ecosystem is evolving. By his accounting, it takes a minimum of three years for a new federal rule of evidence to be adopted.

The Trump administration’s AI Action Plan, released in July as the administration’s road map for American AI efforts, highlights the need to “combat synthetic media in the court system” and advocates for exploring deepfake-specific standards similar to the proposed evidence rule changes. Yet other law practitioners think a cautionary approach is wisest: waiting to see how often deepfakes are really passed off as evidence in court, and how judges react, before moving to update overarching rules of evidence.

Jonathan Mayer, the former chief science and technology adviser and chief AI officer at the U.S. Justice Department under President Joe Biden and now a professor at Princeton University, told NBC News he routinely encountered the issue of AI in the court system: “A recurring question was whether effectively addressing AI abuses would require new law, including new statutory authorities or court rules.”

“We generally concluded that existing law was sufficient,” he said. However, “the impact of AI could change — and it could change quickly — so we also thought through and prepared for possible scenarios.”

In the meantime, attorneys may become the first line of defense against deepfakes invading U.S. courtrooms.

Louisiana Fifth Circuit Court of Appeal Judge Scott Schlegel. (Courtesy of Scott Schlegel)

Schlegel pointed to Louisiana’s Act 250, passed earlier this year, as a successful and effective way to change norms about deepfakes at the state level. The act mandates that attorneys exercise “reasonable diligence” to determine whether evidence they or their clients submit has been generated by AI.

“The courts can’t do it all by themselves,” Schlegel said. “When your client walks in the door and hands you 10 photographs, you should ask them questions. Where did you get these photographs? Did you take them on your phone or a camera?”

“If it doesn’t smell right, you need to do a deeper dive before you offer that evidence into court. And if you don’t, then you’re violating your duties as an officer of the court,” he said.

Daniel Garrie, co-founder of the cybersecurity and digital forensics company Law & Forensics, said human expertise will have to continue to supplement digital-only efforts. “No tool is perfect, and frequently additional facts become relevant,” Garrie wrote via email.
“For example, it may be impossible for a person to have been at a certain location if GPS data shows them elsewhere at the time a photo was purportedly taken.”

Metadata — the invisible descriptive data attached to files that records facts like a file’s origin, date of creation and date of modification — could be a key defense against deepfakes in the near future. In the Mendones case, for example, the court found that the metadata of one of the purportedly real but deepfaked videos showed it was captured on an iPhone 6, which was impossible given that the plaintiffs’ argument required capabilities available only on an iPhone 15 or newer.

Courts could also mandate that video- and audio-recording hardware include robust mathematical signatures attesting to the provenance and authenticity of their outputs, allowing courts to verify that content was recorded by actual cameras. Such technological solutions may still run into critical stumbling blocks similar to those that plagued prior legal efforts to adapt to new technologies, like DNA testing or even fingerprint analysis. Parties lacking the latest technical AI and deepfake know-how may face a disadvantage in proving evidence’s origin.

Grossman, the University of Waterloo professor, said that for now, judges need to keep their guard up. “Anybody with a device and internet connection can take 10 or 15 seconds of your voice and have a convincing enough tape to call your bank and withdraw money. Generative AI has democratized fraud.”

“We’re really moving into a new paradigm,” Grossman said. “Instead of trust but verify, we should be saying: Don’t trust, and verify.”

Jared Perlo is a writer and reporter at NBC News covering AI. He is currently supported by the Tarbell Center for AI Journalism.
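The provenance-signature idea the article describes can be sketched in a few lines of Python. This is a simplified illustration only: it uses a shared secret (HMAC), whereas real content-provenance schemes such as C2PA use asymmetric keys held in a device's secure hardware, and the `DEVICE_KEY` and byte strings here are purely hypothetical.

```python
import hashlib
import hmac

# Hypothetical device secret; a real camera would hold a private
# signing key in tamper-resistant hardware instead of a shared key.
DEVICE_KEY = b"hypothetical-device-secret"

def sign_recording(recording: bytes) -> str:
    """Camera side: produce a tag binding the device key to the bytes."""
    return hmac.new(DEVICE_KEY, recording, hashlib.sha256).hexdigest()

def verify_recording(recording: bytes, tag: str) -> bool:
    """Court side: recompute the tag and compare in constant time."""
    expected = hmac.new(DEVICE_KEY, recording, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

original = b"raw video bytes straight from the sensor"
tag = sign_recording(original)

print(verify_recording(original, tag))         # True: footage is untouched
print(verify_recording(original + b"!", tag))  # False: any edit breaks the tag
```

The point of the sketch is the asymmetry it creates for a court: footage that verifies was provably emitted by the keyed device, while even a one-byte alteration, let alone a generated deepfake, fails verification.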
Nov. 18, 2025, 5:00 AM EST
By Babak Dehghanpisheh and Monica Alba

Whether viewed as a visionary reformist or a murderous despot, Crown Prince Mohammed bin Salman, Saudi Arabia’s de facto leader, will be taking a huge step toward rejoining the international community when he meets with President Donald Trump at the White House on Tuesday.

Bin Salman, 40, became an international pariah after the murder of journalist Jamal Khashoggi, a fierce critic of his government, in 2018, though Trump defended the Saudi government even after the CIA concluded that the crown prince himself ordered the killing. Then-President Joe Biden traveled to Saudi Arabia in 2022 and controversially fist-bumped bin Salman, an image that went viral at a time when most leaders had shunned the crown prince. Bin Salman said in 2019 that he took “full responsibility” for the Khashoggi killing since it happened on his watch, but denied ordering it.

But it is bin Salman’s trip Tuesday, his first during Trump’s second term, that will be seen more broadly as a move toward acceptance back into the diplomatic fold.

“He’s a different kind of figure now. Obviously, the questions about the manner of his rule and internal repression, those things haven’t gone away. But he’s a changed figure; it’s a changed moment. And, I think, important symbolically in that sense,” said Michael Wahid Hanna, the U.S. program director at the International Crisis Group, a global nonprofit organization based in Brussels that works to prevent conflicts. He added: “He’s central to what this administration wants to do in the region.”

Trump and bin Salman are expected to sign economic and defense agreements, a White House official told NBC News.
Even before bin Salman had set foot in the United States, Trump confirmed at an Oval Office event Monday that he would be willing to sign off on the sale of F-35 stealth fighter jets to the kingdom, a contentious move that could shift the balance of power in the Middle East, where Israel has been the primary recipient of America’s cutting-edge military technology.

Trump’s announcement of the sale may not actually lead to Saudi Arabia’s receiving the F-35s anytime soon, analysts say. “The devil will kind of be in the details there,” said Andrew Leber, a nonresident fellow at the Carnegie Endowment for International Peace who has done extensive research on Saudi Arabia, noting that a similar deal announced with the United Arab Emirates fell through. He added, “That deal ultimately ran aground on a combination of U.S. concerns with maintaining Israel’s qualitative military edge and concerns about the extent to which U.S. security technology might leak to China.”

The possibility of Saudi Arabia’s normalizing relations with Israel will be a key part of the talks, according to the White House official. The official said Trump “hopes” the kingdom will soon join the Abraham Accords, the 2020 U.S.-brokered agreement that led a number of regional countries to establish formal diplomatic ties with Israel, though analysts are skeptical about a breakthrough.

“There’s no near-term horizon for normalization at the moment,” said Hanna of the ICG. “The risks for Mohammed bin Salman are extremely high if he joins the Abraham Accords,” agreed Fawaz Gerges, professor of international relations at the London School of Economics. He noted that the Saudis had made clear they would need some form of Israeli commitment to a path to a Palestinian state — something the Israelis have publicly dismissed.
The crown prince’s strategy was generally “to minimize the risks to his rule,” Gerges said.

Even if bin Salman does not announce the establishment of diplomatic ties with Israel, he has won favor with Trump as one of the regional leaders who helped pull together the current ceasefire between the Israel Defense Forces and the militants of Hamas.

Trump has long touted his deal-making abilities, and, according to a senior administration official, a number of deals are expected to be announced Tuesday, including a multibillion-dollar Saudi investment in America’s artificial intelligence infrastructure, enhanced cooperation on civil nuclear energy and fulfillment of the Saudis’ $600 billion investment pledge via dozens of targeted investments.

Critics have raised questions about Trump’s affinity for mixing personal business and diplomacy. His properties have for years hosted tournaments for the Saudi-backed LIV Golf. And The New York Times reported this weekend that the Trump Organization is looking at a huge real estate deal with Saudi Arabia.

“There’s some massive ethical questions in here,” said Leber of the Carnegie Endowment. “It’s very obvious that all of the Gulf states have realized that the way you get to Trump is to find some way to enrich his family members, enrich his friends, promise to enrich them down the line.”

Governments dealing with Saudi Arabia, human rights groups have long said, should also push the country’s leaders on its dismal human rights record. In August, a report from Human Rights Watch noted an “unprecedented surge” in executions in 2025, with 241 people killed as of Aug. 5.

Still, the restrictions on women, another regular criticism leveled at the kingdom, have been eased, and bin Salman has tried to open up the society to Western exports, like Ultimate Fighting Championship matches and comedy shows, though the comedians who recently appeared at a comedy festival in Riyadh, including Louis C.K. and Bill Burr, were blasted for performing there.

“This hasn’t been political reform in the sense of creating space for real politics, but he’s absolutely, fundamentally reoriented Saudi society and changed the role of the religious authorities,” said Hanna of the ICG. “There’s incredible social change that has happened partly because he’s operating without any real constraints.”

Babak Dehghanpisheh is an NBC News Digital international editor based in New York. Monica Alba is a White House correspondent for NBC News. Abigail Williams and Freddie Clayton contributed.
Oct. 19, 2025, 2:24 AM EDT
By Dennis Romero

Romantic party crasher Domingo made his return to “Saturday Night Live” alongside host and musical guest Sabrina Carpenter as politics took a backseat to pop culture.

Carpenter took over hosting duties for the first time and performed hits from her album “Man’s Best Friend” for her double-duty role for the night. The show’s start was delayed because of college football.

Domingo, now a recurring character played by Marcello Hernández, returned as Chloe Fineman’s Kelsey once again humiliated husband Matt, played by Andrew Dismukes. Kelsey was focused on Matt’s 30th birthday, with a romantic night out, a table for two and her best friends, the “Kelsquad,” which sang about their recent trip to Nashville with Kelsey — and Domingo. “D, O, M, I, N, G, O, Domingo!” they sang.

Enter Domingo, traveling Lothario and goateed singer, crooning about the night he and Kelsey had in Nashville, which, he noted, triggered a noise complaint. “Kelsey, I’m serious, this is strike six!” a frustrated Matt warned.

Politics weren’t kept completely out of the show, as President Donald Trump, played by James Austin Johnson, continued a political strategy of appearing on podcasts. This time, Trump took a seat at the table hosted by the “Snack Homiez,” a group of 12-year-old boys — and one 13-year-old “unc” — portrayed by Carpenter and women on the cast. They discussed “GOATed” vegetables and best Halloween candies. “Some vegetables are fire, and some vegetables low-key be a fruit,” Carpenter’s character said on the podcast.

Trump was introduced by podcast host Braylor, played by Fineman: “You know him, he’s all over TikTok: President Donald J. Trump.” Trump was asked to weigh in with his favorite vegetable. “I’ve never been one for the veggies,” said Johnson’s Trump. “Ding Dongs. I like a Ding Dong.” “We love Little Debbie,” he continued. “She does tremendous work. It’s awful what happened to her.”

Johnson’s Trump meandered off-topic in response to a question about his thoughts on Airheads candy. “You know who I do like is George Santos,” he told the boys. “He’s weird. He’s a liar. I think he’s great. We don’t know anything about him. He’s one of our favorite people. I don’t know him at all. I don’t know anything about him.”

Trump on Friday commuted the sentence of the former U.S. representative, who served only a few months of a more than seven-year sentence for wire fraud and aggravated identity theft. Johnson’s Trump referenced Santos’ commutation as he spoke about the day’s nationwide No Kings protests. “The people are marching because they’re happy he’s free, right?” he said of Santos. “It’s a ‘Yes, King’ march.”

“So, maybe if I think about it, blue Airhead,” Johnson’s Trump finally concluded.

“SNL” airs on NBC, a division of NBCUniversal, which is also the parent company of NBC News.

Dennis Romero is a breaking news reporter for NBC News Digital.
© Copyright 2025 Be That! All Rights Reserved.