Currently, the world is producing 16.3 zettabytes of data a year. According to IDC, by 2025 that amount will rise tenfold, to 163 zettabytes a year. But how big, exactly, is a zettabyte?
According to IDC analysis, 16.3 zettabytes of data were created worldwide in 2017. Such volume shows why keeping an eye on big data business trends is so crucial.
Harnessing the power of AI’s image recognition and deep learning may significantly reduce the cost of visual quality control.
CodiLime ranks 49th on the FT 1000: the complete list of Europe’s 1000 fastest growing companies. The company outperformed more than 950 companies from across the continent.
The Tailored Team Training Tracks (4T) method is a way of teaching data science by working with a specific team on selected real-life projects in a company. It facilitates an effective learning experience for adults, addressing the job market’s growing need for data scientists.
The way we teach data science today has many limitations. Universities, bootcamps and online courses have yet to provide an optimal learning experience that answers job market needs. A fourth way is needed.
The number of data scientists has grown by over 650% over the past five years. Machine learning and deep learning skills are in huge demand at present, and the list of reasons to join those already working in the profession is long.
The company joins the prestigious NVIDIA Service Delivery Partner program as one of a handful of preferred partners worldwide providing professional services in deep learning.
The National Science Centre in Poland has granted deepsense.ai data scientist and University of Warsaw researcher Piotr Miłoś 500,000 USD. His project, “Reinforcement Learning – contemporary challenges”, is designed to help resolve long-standing open problems in training robots and game agents in difficult environments.
The deepsense.ai ML team has been working with Google Brain on helping AI imagine and reason about the future. The team started by optimizing TensorFlow’s infrastructure for reinforcement learning and moved on to end-to-end training of AI entirely on Google’s newest Cloud TPUs.