
Tag: NVIDIA

  • Nvidia beats earnings expectations as investors eye demand for Blackwell AI chips

    LOS ANGELES — Nvidia on Wednesday reported a surge in third-quarter profit and sales as demand for its specialized computer chips that power artificial intelligence systems remains robust.

    For the three months that ended Oct. 27, the tech giant based in Santa Clara, California, posted revenue of $35.08 billion, up 94% from $18.12 billion a year ago.

    Nvidia said it earned $19.31 billion in the quarter, more than double the $9.24 billion it posted in last year’s third quarter. Adjusted for one-time items, it earned 81 cents a share.

    Wall Street analysts had been expecting adjusted earnings of 75 cents a share on revenue of $33.17 billion, according to FactSet.

    Investors took the results in stride, however, and Nvidia’s high-flying stock slipped about 1% in after-hours trading. Shares in Nvidia Corp. are up 195% so far this year.

    “The age of AI is in full steam, propelling a global shift to Nvidia computing,” Jensen Huang, founder and CEO of Nvidia, said in a statement.

    Analysts were eyeing Nvidia’s guidance on its Blackwell graphics processing unit, a next-generation artificial intelligence chip that’s seen demand from companies like OpenAI and others building AI data centers. Over the summer, the tech juggernaut said it would increase production of its Blackwell AI chips beginning in the fourth quarter and continuing through fiscal 2026.

    Huang said in an interview with CNBC last month that demand for Blackwell is “insane.”

    “Everybody wants to have the most and everybody wants to be first,” Huang said.

    Nvidia has led the artificial intelligence sector to become one of the stock market’s biggest companies, as tech giants spend heavily on the company’s chips and data centers needed to train and operate their AI systems.

    The company carved out an early lead in the AI applications race, in part because of Huang’s successful bet on the chip technology used to fuel the industry. The company is no stranger to big bets. Nvidia’s invention of the graphics processing unit, or GPU, in 1999 helped spark the growth of the PC gaming market and redefined computer graphics.

    Demand for generative AI products that can compose documents, make images and serve as personal assistants has fueled sales of Nvidia’s specialized chips over the last year. Nvidia, the most valuable publicly traded company by market cap as of Wednesday morning, is now worth over $3.5 trillion, with analysts closely monitoring Nvidia’s path to $4 trillion.

    Through the year’s first six months, Nvidia’s stock soared nearly 150%. At that point, the stock was trading at a little more than 100 times the company’s earnings over the prior 12 months. That’s much more expensive than the stock has been historically, and more expensive than the S&P 500 in general.

  • Nvidia rivals focus on building a different kind of chip to power AI products

    SANTA CLARA, Calif. — Building the current crop of artificial intelligence chatbots has relied on specialized computer chips pioneered by Nvidia, which cornered the market and made itself the poster child of the AI boom.

    But the same qualities that make those graphics processing units, or GPUs, so effective at creating powerful AI systems from scratch make them less efficient at putting AI products to work.

    That’s opened up the AI chip industry to rivals who think they can compete with Nvidia in selling so-called AI inference chips that are more attuned to the day-to-day running of AI tools and designed to reduce some of the huge computing costs of generative AI.

    “These companies are seeing opportunity for that kind of specialized hardware,” said Jacob Feldgoise, an analyst at Georgetown University’s Center for Security and Emerging Technology. “The broader the adoption of these models, the more compute will be needed for inference and the more demand there will be for inference chips.”

    It takes a lot of computing power to make an AI chatbot. It starts with a process called training or pretraining — the “P” in ChatGPT — that involves AI systems “learning” from the patterns of huge troves of data. GPUs are good at doing that work because they can run many calculations at a time on a network of devices in communication with each other.

    However, once trained, a generative AI tool still needs chips to do the work — such as when you ask a chatbot to compose a document or generate an image. That’s where inferencing comes in. A trained AI model must take in new information and make inferences from what it already knows to produce a response.
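
    The training-versus-inference distinction can be made concrete with a minimal sketch in PyTorch. This is not drawn from the article and is not specific to any Nvidia product; the tiny model and random data are illustrative placeholders.

    ```python
    # A minimal sketch (not from the article): contrast a training step, which
    # updates model weights via gradients, with an inference call, which only
    # runs the trained model forward. The tiny model and random data below are
    # illustrative placeholders, not anything Nvidia- or ChatGPT-specific.
    import torch
    import torch.nn as nn

    # Toy model: maps a 16-dimensional input to scores over a 10-"token" vocabulary.
    model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 10))
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    # --- Training step: forward pass, loss, backward pass, weight update. ---
    inputs = torch.randn(32, 16)            # a batch of 32 fake training examples
    targets = torch.randint(0, 10, (32,))   # fake "correct answer" labels
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()                         # gradient computation: the heavy part
    optimizer.step()                        # the model's weights change here

    # --- Inference: no gradients, no weight updates, just a forward pass. ---
    model.eval()
    with torch.no_grad():
        new_input = torch.randn(1, 16)      # one new, unseen example
        prediction = model(new_input).argmax(dim=-1)
    print(prediction.item())
    ```

    The training step above is repeated many times over huge datasets, which is why it benefits from hardware that runs many calculations in parallel; the inference call is a single, lighter forward pass.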

    GPUs can do that work, too. But it can be a bit like taking a sledgehammer to crack a nut.

    “With training, you’re doing a lot heavier, a lot more work. With inferencing, that’s a lighter weight,” said Forrester analyst Alvin Nguyen.

    That’s led startups like Cerebras, Groq and d-Matrix as well as Nvidia’s traditional chipmaking rivals — such as AMD and Intel — to pitch more inference-friendly chips as Nvidia focuses on meeting the huge demand from bigger tech companies for its higher-end hardware.

    D-Matrix, which is launching its first product this week, was founded in 2019 — a bit late to the AI chip game, as CEO Sid Sheth explained during a recent interview at the company’s headquarters in Santa Clara, California, the same Silicon Valley city that’s also home to AMD, Intel and Nvidia.

    “There were already 100-plus companies. So when we went out there, the first reaction we got was ‘you’re too late,’” he said. The pandemic’s arrival six months later didn’t help as the tech industry pivoted to a focus on software to serve remote work.

    Now, however, Sheth sees a big market in AI inferencing, comparing that later stage of machine learning to how human beings apply the knowledge they acquired in school.

    “We spent the first 20 years of our lives going to school, educating ourselves. That’s training, right?” he said. “And then the next 40 years of your life, you kind of go out there and apply that knowledge — and then you get rewarded for being efficient.”

    The product, called Corsair, consists of two chips with four chiplets each, made by Taiwan Semiconductor Manufacturing Company — the same manufacturer of most of Nvidia’s chips — and packaged together in a way that helps to keep them cool.

    The chips are designed in Santa Clara, assembled in Taiwan and then tested back in California. Testing is a long process and can take six months — if anything is off, it can be sent back to Taiwan.

    D-Matrix workers were doing final testing on the chips during a recent visit to a laboratory with blue metal desks covered with cables, motherboards and computers, with a cold server room next door.

    While tech giants like Amazon, Google, Meta and Microsoft have been gobbling up the supply of costly GPUs in a race to outdo each other in AI development, makers of AI inference chips are aiming for a broader clientele.

    Forrester’s Nguyen said that could include Fortune 500 companies that want to make use of new generative AI technology without having to build their own AI infrastructure. Sheth said he expects a strong interest in AI video generation.

    “The dream of AI for a lot of these enterprise companies is you can use your own enterprise data,” Nguyen said. “Buying (AI inference chips) should be cheaper than buying the ultimate GPUs from Nvidia and others. But I think there’s going to be a learning curve in terms of integrating it.”

    Feldgoise said that, unlike training-focused chips, AI inference work prioritizes how fast a person will get a chatbot’s response.

    He said another whole set of companies is developing AI hardware for inference that can run not just in big data centers but locally on desktop computers, laptops and phones.

    Better-designed chips could bring down the huge costs of running AI for businesses. That could also affect the environmental and energy costs for everyone else.

    Sheth says the big concern right now is, “are we going to burn the planet down in our quest for what people call AGI — human-like intelligence?”

    It’s still fuzzy when AI might get to the point of artificial general intelligence — predictions range from a few years to decades. But, Sheth notes, only a handful of tech giants are on that quest.

    “But then what about the rest?” he said. “They cannot be put on the same path.”

    Those other companies don’t want to use very large AI models; they’re too costly and use too much energy.

    “I don’t know if people truly, really appreciate that inference is actually really going to be a much bigger opportunity than training. I don’t think they appreciate that. It’s still training that is really grabbing all the headlines,” Sheth said.

  • Supreme Court seems likely to allow class action to proceed against tech company Nvidia

    WASHINGTON — The Supreme Court on Wednesday seemed likely to keep alive a class-action lawsuit accusing Nvidia of misleading investors about its dependence on selling computer chips for the mining of volatile cryptocurrency.

    The justices heard arguments in the tech company’s appeal of a lower-court ruling allowing a 2018 suit led by a Swedish investment management firm to continue.

    It’s one of two high court cases involving class-action lawsuits against tech companies. Last week, the justices wrestled with whether to shut down a multibillion-dollar class action investors’ lawsuit against Facebook parent Meta stemming from the privacy scandal involving the Cambridge Analytica political consulting firm.

    On Wednesday, a majority of the court that included liberal and conservative justices appeared to reject the arguments advanced by Neal Katyal, the lawyer for Santa Clara, California-based Nvidia.

    “It’s less and less clear why we took this case and why you should win it,” Justice Elena Kagan said.

    The lawsuit followed a dip in the profitability of cryptocurrency, which caused Nvidia’s revenues to fall short of projections and led to a 28% drop in the company’s stock price.

    In 2022, Nvidia paid a $5.5 million fine to settle charges by the Securities and Exchange Commission that it failed to disclose that cryptomining was a significant source of revenue growth from the sale of graphics processing units that were produced and marketed for gaming. The company did not admit to any wrongdoing as part of the settlement.

    Nvidia has led the artificial intelligence sector to become one of the stock market’s biggest companies, as tech giants continue to spend heavily on the company’s chips and data centers needed to train and operate their AI systems.

    That chipmaking dominance has cemented Nvidia’s place as the poster child of the artificial intelligence boom — what CEO Jensen Huang has dubbed “the next industrial revolution.” Demand for generative AI products that can compose documents, make images and serve as personal assistants has fueled sales of Nvidia’s specialized chips over the last year.

    Nvidia is among the most valuable companies in the S&P 500, worth over $3 trillion. The company is set to report its third-quarter earnings next week.

    In the Supreme Court case, the company is arguing that the investors’ lawsuit should be thrown out because it does not measure up to a 1995 law, the Private Securities Litigation Reform Act, that is intended to bar frivolous complaints.

    A district court judge had dismissed the complaint before the federal appeals court in San Francisco ruled that it could go forward. The Biden administration is backing the investors.

    A decision is expected by early summer.

    ___

    Associated Press writer Sarah Parvini in Los Angeles contributed to this report.

  • NVIDIA GeForce RTX AI PCs will get you into the new AI lifestyle

    NVIDIA is stepping into the AI world not treading softly, but taking big strides and wielding weapons stamped “RTX AI”.

    Simply put, there are now NVIDIA GeForce RTX AI PCs supercharged with generative AI for gaming, content creation, and everyday productivity.

    We welcome the NVIDIA GeForce RTX AI PCs as tools for people exploring the many wonders of the world of artificial intelligence.

    NVIDIA is the name behind GeForce GPUs and high-end AI processing hardware, and the renowned global chipmaker boasts new GeForce graphics hardware that can bring faster, better AI tools to gamers and professionals. NVIDIA is fully focused on leveraging the AI capabilities of its GeForce RTX GPUs (mobile and desktop), resulting in new automated and intelligent features.

    NVIDIA CEO Jensen Huang’s keynote at the recent Computex revealed new GeForce RTX AI laptops. These newly branded “GeForce RTX AI Laptops” offer new AI features that run on the gaming GPUs’ Tensor cores.

    Windows Copilot+ PCs, for example, get an NVIDIA AI boost: features that add an extra set of tricks while relying on the latest RTX GPUs to meet more demanding applications. Other new models include the Asus TUF A14 and A16 gaming laptops, the latest Asus ROG Zephyrus G16, the Asus ProArt PX13 and PX16 creator laptops, and the MSI Stealth A16 Studio. These are the notable “RTX AI” laptops, and more than 200 RTX AI-branded laptops will arrive from top brands including Acer, Dell, Gigabyte, HP, Lenovo, LG, MSI, Razer, and Samsung.

    NVIDIA says the hardware in these laptops can deliver up to 686 trillion operations per second (TOPS) while supporting AI features in more than 6500 games and apps. NVIDIA also says the new RTX-equipped laptops will deliver seven times faster performance in Stable Diffusion 1.5 image generation and 10 times faster large language model (LLM) performance.

    Seeing a tough but exciting road ahead, the chipmaker makes a bold claim: NVIDIA GeForce RTX and NVIDIA RTX GPUs are truly built for the era of AI. Powered by NVIDIA’s industry-leading GPU architecture, RTX GPUs feature specialized AI Tensor Cores that deliver up to 1300 AI TOPS of processing power for cutting-edge performance and transformative capabilities in gaming, content creation, everyday productivity, and more.

    Over 600 AI-enabled applications and games are accelerated by RTX AI PCs. Notable RTX AI Exclusive features and benefits include:

    Productivity

    ChatRTX – Get tailored responses from local files with your own personal private chatbot. Search personal notes, files, and photos with text or voice.

    Video Production

    Enhanced AI Effects – Supercharge your video editing process with AI effects and tools in DaVinci Resolve, Adobe Premiere Pro, Capcut, and more.

    3D Design

    Interactive Design – Real-time viewport rendering, upscaling, ray reconstruction, and more are available in 3D creative apps like Adobe Substance, Adobe Painter, Blender, D5 Render and Unreal Engine. Boost rendering performance with NVIDIA DLSS and OptiX AI technologies.

    Live Streaming

    NVIDIA Broadcast – Level up your live streams with AI-powered Noise Removal, Background Replacement, and more.

    Gaming

    NVIDIA DLSS – Maximize performance and quality in your favorite games. DLSS uses AI to create additional frames and improve image quality.

    Digital Humans and Game NPCs

    NVIDIA ACE – Bring AI game characters and digital humans to life with generative AI.

    Game Modding

    RTX Remix – Easily capture game assets, automatically enhance materials with generative AI tools, and quickly create stunning RTX remasters with full ray tracing and DLSS.

    Digital Art and Design

    NVIDIA Canvas – Turn simple brushstrokes into realistic landscape images for backgrounds, concept exploration, or creative inspiration.

    App Development

    RTX AI Toolkit – Customize, optimize, and deploy AI models for Windows applications with an end-to-end suite of tools and SDKs.

    Image and Video Generation

    Stable Diffusion – Generate images and videos faster on NVIDIA RTX GPUs, accelerated by NVIDIA TensorRT (see the code sketch after this list).

    Entertainment

    RTX Video – AI Super Resolution and HDR in Chrome, Edge, and Firefox browsers plus VLC Media Player turn standard internet video into crystal clear 4K HDR media. 
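
    As a rough illustration of the Stable Diffusion entry above, here is a generic sketch that generates an image on a CUDA-capable RTX GPU with the Hugging Face diffusers library. It assumes diffusers, transformers, and PyTorch are installed and a Stable Diffusion 1.5 checkpoint is available; it does not show NVIDIA’s TensorRT-accelerated path specifically.

    ```python
    # Generic sketch of Stable Diffusion 1.5 image generation on a CUDA GPU using
    # the Hugging Face diffusers library. This is NOT NVIDIA's TensorRT-optimized
    # pipeline, just an illustration of the inference workload an RTX GPU runs.
    # Assumes `pip install torch diffusers transformers accelerate` and a CUDA GPU.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",   # example checkpoint; any SD 1.5 model works
        torch_dtype=torch.float16,          # half precision to fit consumer GPU memory
    )
    pipe = pipe.to("cuda")                  # run the denoising steps on the RTX GPU

    image = pipe("a snowy mountain village at dusk, digital painting").images[0]
    image.save("village.png")
    ```

    In the RTX AI setups NVIDIA describes, the same kind of pipeline is accelerated further by TensorRT, which compiles the model into optimized GPU engines.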

    RTX AI Key Benefits

    Do It All – Create, play, build and elevate your everyday—faster.

    NVIDIA sees importance in all of these features, as technology can hardly be discussed anymore without AI. And NVIDIA is now deeply invested in creating tools and apps that use GeForce RTX GPUs and RTX-accelerated software to make AI more accessible and helpful to as many people as possible.

    AI is no longer uncharted territory. Get ready to explore and gain from it through NVIDIA GeForce RTX AI PCs.
