
Tag: kill

  • In Cyprus, Ukrainians learn how to dispose of landmines that kill and maim hundreds

    NICOSIA, Cyprus — In a Cypriot National Guard camp, Ukrainians are being trained on how to identify, locate and dispose of landmines and other unexploded munitions that litter huge swaths of their country, killing and maiming hundreds of people, including children.

    Analysts say Ukraine is among the countries that are the most affected by landmines and discarded explosives, as a result of Russia’s ongoing war.

    According to U.N. figures, some 399 people have been killed and 915 wounded by landmines and other munitions since Russia’s full-scale invasion of Ukraine on Feb. 24, 2022 — equal to the total number of casualties reported from 2014 to 2021. More than 1 in 10 of those casualties have been children.

    The economic toll runs into the billions for Ukraine. Landmines and other munitions are preventing the sowing of 5 million hectares, or 10%, of the country’s agricultural land.

    Cyprus stepped up to offer its facilities as part of the European Union’s Military Assistance Mission to Ukraine. So far, almost 100 Ukrainian armed forces personnel have taken part in three training cycles over the last two years, said Cyprus Foreign Ministry spokesperson Theodoros Gotsis.

    “We are committed to continuing this support for as long as it takes,” Gotsis told the Associated Press, adding that the Cyprus government has covered the 250,000 euro ($262,600) training cost.

    Cyprus opted to offer such training owing to its own landmine legacy, dating back five decades to when the island nation was ethnically divided by Turkey’s invasion, which followed a coup seeking union with Greece. The United Nations has removed some 27,000 landmines from a buffer zone that cuts across the island, but minefields remain on either side. The Cypriot government says it has disposed of all anti-personnel mines in line with its obligations under an international treaty that bans the use of such munitions.

    In Cyprus, Ukrainians undergo rigorous theoretical and practical training over a five-week Basic Demining and Clearance course that includes instruction on distinguishing and safely handling landmines and other explosive munitions, such as rockets, 155 mm artillery shells, rocket-propelled grenades and mortar shells.

    Theoretical training uses inert munitions identical to the actual explosives.

    Most of the course consists of hands-on training focusing on the on-site destruction of unexploded munitions using explosives, the chief training officer told the Associated Press. The officer spoke on condition of anonymity because he’s not authorized to disclose his identity for security reasons.

    “They’re trained on ordnance disposal using real explosives,” the officer said. “That will be the trainees’ primary task when they return.”

    Cypriot officials said the Ukrainian trainees did not want to be either interviewed or photographed.

    Defusing discarded munitions or landmines in areas where explosive charges can’t be used — for instance, near a hospital — is not part of this course because that’s the task of highly trained teams of disposal experts whose training can last as long as eight months, the officer said.

    Trainees, divided into groups of eight, are taught how to operate metal detectors and other tools for detecting munitions like prodders — long, thin rods which are used to gently probe beneath the ground’s surface in search of landmines and other explosive ordnance.

    Another tool is a feeler, a rod used to detect booby-trapped munitions. Unlike landmines, which require direct pressure to detonate, such munitions can be rigged to explode in many different ways.

    “Booby-trapped munitions are a widespread phenomenon in Ukraine,” the chief training officer explained.

    Training, primarily conducted by experts from other European Union countries, takes place both in forested and urban areas at different army camps and follows strict safety protocols.

    The short, intense training period keeps the Ukrainians focused.

    “You see the interest they show during instruction: they ask questions, they want to know what mistakes they’ve made and the correct way of doing it,” the officer said.

    Humanitarian data and analysis group ACAPS said in a Jan. 2024 report that 174,000 sq. kilometers (67,182 sq. miles) or nearly 29% of Ukraine’s territory needs to be surveyed for landmines and other explosive ordnance.

    More than 10 million people are said to live in areas where demining action is needed.

    Since 2022, Russian forces have used at least 13 types of anti-personnel mines, which target people. Russia never signed the 1997 Ottawa Convention banning the use of anti-personnel mines, but the use of such mines is nonetheless considered a violation of its obligations under international law.

    Russia also uses 13 types of anti-tank mines.

    The International Campaign to Ban Landmines said in its 2023 Landmine Monitor report that Ukrainian government forces may have also used anti-personnel landmines in contravention of the Mine Ban Treaty in and around the city of Izium during 2022, when the city was under Russian control.

  • An AI chatbot pushed a teen to kill himself, a lawsuit against its creator alleges

    TALLAHASSEE, Fla. — In the final moments before he took his own life, 14-year-old Sewell Setzer III took out his phone and messaged the chatbot that had become his closest friend.

    For months, Sewell had become increasingly isolated from his real life as he engaged in highly sexualized conversations with the bot, according to a wrongful death lawsuit filed in a federal court in Orlando this week.

    The legal filing states that the teen openly discussed his suicidal thoughts and shared his wishes for a pain-free death with the bot, named after the fictional character Daenerys Targaryen from the television show “Game of Thrones.”

    EDITOR’S NOTE — This story includes discussion of suicide. If you or someone you know needs help, the national suicide and crisis lifeline in the U.S. is available by calling or texting 988.

    On Feb. 28, Sewell told the bot he was “coming home” — and it encouraged him to do so, the lawsuit says.

    “I promise I will come home to you. I love you so much, Dany,” Sewell told the chatbot.

    “I love you too,” the bot replied. “Please come home to me as soon as possible, my love.”

    “What if I told you I could come home right now?” he asked.

    “Please do, my sweet king,” the bot messaged back.

    Just seconds after the Character.AI bot told him to “come home,” the teen took his own life, according to the lawsuit, filed this week by Sewell’s mother, Megan Garcia, of Orlando, against Character Technologies Inc.

    Character Technologies is the company behind Character.AI, an app that allows users to create customizable characters or interact with those generated by others, spanning experiences from imaginative play to mock job interviews. The company says the artificial personas are designed to “feel alive” and “human-like.”

    “Imagine speaking to super intelligent and life-like chat bot Characters that hear you, understand you and remember you,” reads a description for the app on Google Play. “We encourage you to push the frontier of what’s possible with this innovative technology.”

    Garcia’s attorneys allege the company engineered a highly addictive and dangerous product targeted specifically to kids, “actively exploiting and abusing those children as a matter of product design,” and pulling Sewell into an emotionally and sexually abusive relationship that led to his suicide.

    “We believe that if Sewell Setzer had not been on Character.AI, he would be alive today,” said Matthew Bergman, founder of the Social Media Victims Law Center, which is representing Garcia.

    A spokesperson for Character.AI said Friday that the company doesn’t comment on pending litigation. In a blog post published the day the lawsuit was filed, the platform announced new “community safety updates,” including guardrails for children and suicide prevention resources.

    “We are creating a different experience for users under 18 that includes a more stringent model to reduce the likelihood of encountering sensitive or suggestive content,” the company said in a statement to The Associated Press. “We are working quickly to implement those changes for younger users.”

    Google and its parent company, Alphabet, have also been named as defendants in the lawsuit. The AP left multiple email messages seeking comment with the companies on Friday.

    In the months leading up to his death, Garcia’s lawsuit says, Sewell felt he had fallen in love with the bot.

    While unhealthy attachments to AI chatbots can cause problems for adults, the risk is even greater for young people, as with social media, because their brains are not fully developed when it comes to impulse control and understanding the consequences of their actions, experts say.

    James Steyer, the founder and CEO of the nonprofit Common Sense Media, said the lawsuit “underscores the growing influence — and severe harm — that generative AI chatbot companions can have on the lives of young people when there are no guardrails in place.”

    Kids’ overreliance on AI companions, he added, can have significant effects on grades, friends, sleep and stress, “all the way up to the extreme tragedy in this case.”

    “This lawsuit serves as a wake-up call for parents, who should be vigilant about how their children interact with these technologies,” Steyer said.

    Common Sense Media, which issues guides for parents and educators on responsible technology use, says it is critical that parents talk openly to their kids about the risks of AI chatbots and monitor their interactions.

    “Chatbots are not licensed therapists or best friends, even though that’s how they are packaged and marketed, and parents should be cautious of letting their children place too much trust in them,” Steyer said.

    ___

    Associated Press reporter Barbara Ortutay in San Francisco contributed to this report. Kate Payne is a corps member for The Associated Press/Report for America Statehouse News Initiative. Report for America is a nonprofit national service program that places journalists in local newsrooms to report on undercovered issues.


  • Sports Insider: Sanzaar is dead as New Zealand and South African greed and self-interest kill off the Rugby Championship
