Nvidia has reportedly bought a synthetic data company. So what is synthetic data?


By [email protected]


Chipmaker Nvidia has acquired Gretel, a synthetic data company, to bolster its generative AI tools for developers, Wired reported Wednesday.

The move comes at a time when AI companies are struggling to find enough data to train and improve their models, which is driving demand for synthetic data.

According to the report, Gretel's employees will be folded into Nvidia. Gretel, which produces synthetic, or simulated, data for training AI models, is expected to bolster Nvidia's offerings for AI developers.

An Nvidia spokesperson declined to comment on the report.

Why does synthetic data matter?

Training AI models such as OpenAI's ChatGPT, a large language model, requires vast amounts of data. Real-world data can pose problems for AI developers: it may be noisy, incomplete or simply not plentiful enough.

AI companies have largely exhausted the training data freely available to them, which has led to conflict over whether they can use copyrighted content. Hundreds of actors, writers and directors submitted an open letter to the Trump administration's Office of Science and Technology Policy to raise their concerns about the use of copyrighted data. Meanwhile, OpenAI has petitioned the government to allow more access to copyrighted materials for training AI models, arguing that otherwise American companies will fall behind China.

Watch this: Nvidia GTC 2025 keynote: All the highlights in 16 minutes

Synthetic data also has value in protecting private information. Gretel says its synthetic data can be used to train models and tools without exposing sensitive or personal information: for example, health care data that doesn't identify individuals and so won't violate privacy laws.
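To make the idea concrete, here is a deliberately simplified sketch of one way synthetic records can be produced. This is a hypothetical toy, not Gretel's actual technique (real tools use far more sophisticated generative models, often with differential privacy guarantees): it samples each field independently from the distribution seen in a small "real" dataset, so no complete real record is copied verbatim.

```python
import random

# Hypothetical toy dataset; names and values are illustrative only.
real_records = [
    {"age": 34, "condition": "asthma"},
    {"age": 52, "condition": "diabetes"},
    {"age": 47, "condition": "asthma"},
    {"age": 61, "condition": "hypertension"},
]

random.seed(1)

def synthesize(records, n):
    # Collect the observed values for each column...
    columns = {key: [r[key] for r in records] for key in records[0]}
    # ...then sample each column independently to build new records.
    return [{key: random.choice(vals) for key, vals in columns.items()}
            for _ in range(n)]

synthetic = synthesize(real_records, 3)
print(synthetic)
```

Note the caveat: sampling columns independently preserves per-column statistics but can still leak rare value combinations, which is one reason production systems layer on privacy protections.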

There are concerns about the use of synthetic data in model training, however. Over-reliance on synthetic information can increase the chance that a model gets things wrong. If the problem becomes severe enough, it can lead to a phenomenon known as model collapse, in which the model becomes so inaccurate that it's useless.
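The mechanism behind model collapse can be illustrated with a minimal simulation, assuming a deliberately crude stand-in for a "model": a Gaussian fit to data. Each generation is trained only on synthetic samples drawn from the previous generation's model, so estimation errors compound and the learned spread tends to drift away from the real data's over many generations.

```python
import random
import statistics

random.seed(42)

# A small "real" dataset drawn from a standard normal distribution.
real_data = [random.gauss(0.0, 1.0) for _ in range(20)]

# "Train" generation 0 on the real data: fit mean and standard deviation.
mu, sigma = statistics.mean(real_data), statistics.stdev(real_data)

history = [sigma]
for _ in range(200):
    # Train each subsequent model purely on the previous model's output.
    synthetic = [random.gauss(mu, sigma) for _ in range(20)]
    mu, sigma = statistics.mean(synthetic), statistics.stdev(synthetic)
    history.append(sigma)

# 'history' tracks the learned spread; with no fresh real data, it
# typically decays over generations instead of staying near 1.0.
print(history[0], history[-1])
```

This is only a cartoon of the effect; actual model collapse in large language models is more subtle, but the core driver is the same: training on a model's own output without fresh real data.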

