9 Best Articles in 2021
github.com
GitHub - jwilm/alacritty: A cross-platform, GPU-enhanced terminal emulator
45 saves · From 2017 · alacritty - A cross-platform, GPU-enhanced terminal emulator
gpu.rocks
gpu.js - GPU Accelerated JavaScript
1 min read · 30 saves · From 2017 · Made in under 24 hours for a NUS Hackers hackathon, out of 68 competing projects.
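As a taste of what "GPU Accelerated JavaScript" means here, a minimal element-wise kernel, assuming gpu.js v2 and its documented createKernel/setOutput/this.thread interface (the data is just illustrative):

```ts
import { GPU, IKernelFunctionThis } from 'gpu.js';

// Each GPU thread computes one output element; this.thread.x is its index.
const gpu = new GPU();
const add = gpu
  .createKernel(function (this: IKernelFunctionThis, a: number[], b: number[]) {
    return a[this.thread.x] + b[this.thread.x];
  })
  .setOutput([4]); // 4-element 1-D output

console.log(add([1, 2, 3, 4], [10, 20, 30, 40])); // Float32Array-like [11, 22, 33, 44]
```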
Mission.org
Why building your own Deep Learning computer is 10x cheaper than AWS
10 min read · 79 saves · From 2018 · If you’ve used, or are considering, AWS/Azure/GCloud for Machine Learning, you know how crazy expensive GPU time is. And turning machines…
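A back-of-the-envelope version of the article's argument; the prices below are rough assumptions of mine (on-demand cloud V100 time at about $3/hour, a roughly $3,000 single-GPU build), not figures from the post:

```ts
// Illustrative break-even arithmetic -- all numbers are assumptions, not the
// article's: ~$3/hour for an on-demand cloud V100, ~$3,000 for a home build.
const cloudDollarsPerHour = 3.0;
const buildCostDollars = 3000;

const breakEvenHours = buildCostDollars / cloudDollarsPerHour; // 1,000 GPU-hours
const hoursPerYearAtHalfUtilisation = 365 * 24 * 0.5;          // 4,380 hours

const cloudCostPerYear = hoursPerYearAtHalfUtilisation * cloudDollarsPerHour;
console.log(`Break-even after ~${breakEvenHours} GPU-hours`);
console.log(`One year at 50% utilisation: ~$${cloudCostPerYear.toFixed(0)} rented`);
// -> ~$13,140/year rented vs. a $3,000 one-off build; over a multi-year
// horizon (before electricity and depreciation) the ratio approaches the
// article's 10x claim.
```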
Smashing Magazine
GPU Animation: Doing It Right
20+ min read · 47 saves · From 2016 · Most people now know that modern web browsers use the GPU to render parts of web pages, especially ones with animation. For example, a CSS animation using the transform property looks much smoother than one using the left and top properties. But if you ask, “How do I get smooth animation from the GPU?” in most cases, you’ll hear something like, “Use transform: translateZ(0) or will-change: transform.” These properties have become something like how we used zoom: 1 for Internet Explorer 6…
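To make the excerpt's advice concrete, a small sketch using the standard Web Animations API (the .box selector and timings are placeholders):

```ts
// GPU-friendly: animating `transform` lets the compositor move the element's
// layer on the GPU without re-running layout and paint on every frame.
const box = document.querySelector<HTMLElement>('.box'); // placeholder selector
if (box) {
  box.animate(
    [{ transform: 'translateX(0)' }, { transform: 'translateX(300px)' }],
    { duration: 400, easing: 'ease-out' },
  );

  // The hint the article discusses; use sparingly, since every hinted element
  // gets its own compositor layer and costs GPU memory.
  box.style.willChange = 'transform';
}

// Anti-pattern the article warns against: animating `left`/`top` instead
// triggers layout and repaint each frame, so it cannot stay on the GPU.
```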
timdettmers.com
Which GPU(s) to Get for Deep Learning
11 saves · From 2017 · You want a cheap, high-performance GPU for deep learning? In this blog post I will guide you through the choices, so you can find the GPU which is best for you.
devblogs.nvidia.com
Inside Volta: The World’s Most Advanced Data Center GPU
~20 min read · 22 saves · From 2017 · NVIDIA Tesla V100 is the most advanced data center GPU ever built to accelerate AI, HPC, and Graphics. This post details the Volta GPU architecture.
timdettmers.com
Which GPU(s) to Get for Deep Learning
20 saves · 2020-09-08 · You want a cheap, high-performance GPU for deep learning? In this blog post I will guide you through the choices, so you can find the GPU which is best for you.
Trending
The Verge
How to watch AMD’s event introducing the next Radeon RX 6000 GPU
1 min read · Mar 3rd · The next GPU in the Radeon RX 6000 series will be announced today.
VICE
What Makes a GPU a GPU, and When Did We Start Calling it That?
8 min read · 10 saves · Mar 31st · Turns out that’s a more complicated question than it sounds.
Cointelegraph
GPU hardware firm riles gaming community by flirting with crypto miners
2 min read · Feb 17th · Gamers were up in arms after a gaming hardware firm appeared to market the already scarce Nvidia 30 series GPU to crypto miners.
MacRumors.com
Octane X GPU Renderer Comes to Mac App Store
1 min read · Mar 9th · OTOY today announced that the Octane X GPU renderer is now available for free from the Mac App Store, bringing native Octane X Enterprise features to...
More like this
github.com
arrayfire/arrayfire
10 saves · From 2015 · arrayfire - ArrayFire: a general-purpose GPU library.
stardustjs.github.io
Stardust: GPU-based Visualization Library
1 min read · 12 saves · From 2017 · Examples: Glyphs, SandDance, Daily Activities, Squares, Isotype
ZDNet
How the GPU became the heart of AI and machine learning
~12 min read · 20 saves · From 2018 · The GPU has evolved from just a graphics chip into a core component of deep learning and machine learning, says Paperspace CEO Dillon Erb.
infoworld.com
Machine learning comes to your browser via JavaScript
3 min read · 69 saves · From 2017 · A new JavaScript library runs Google's TensorFlow right in the browser with GPU acceleration—a novel way to bring machine learning to the masses
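The library the article describes, deeplearn.js, was later folded into TensorFlow.js; a minimal sketch with the current @tensorflow/tfjs API, which executes tensor ops as WebGL shaders where a GPU is available:

```ts
import * as tf from '@tensorflow/tfjs';

// A tiny GPU-accelerated computation in the browser: with the default WebGL
// backend, this matrix multiply runs on the GPU.
const a = tf.tensor2d([[1, 2], [3, 4]]);
const b = tf.tensor2d([[5, 6], [7, 8]]);

const product = tf.matMul(a, b);
product.print(); // [[19, 22], [43, 50]]

// Tensors hold GPU memory (WebGL textures); dispose of them when done.
a.dispose(); b.dispose(); product.dispose();
```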
Adrian Rosebrock
How-To: Multi-GPU training with Keras, Python, and deep learning
20+ min read · 19 saves · From 2017 · In this tutorial you'll learn how you can scale Keras and train deep neural networks using multiple GPUs with the Keras deep learning library and Python.
blog.jwilm.io
Announcing Alacritty, a GPU-accelerated terminal emulator
7 min read · 24 saves · From 2017 · Initial source-only release of Alacritty
OpenAI
Block-Sparse GPU Kernels
4 min read · 10 saves · From 2017 · We’re releasing highly-optimized GPU kernels for an underexplored class of neural network architectures: networks with block-sparse weights. Depending on the chosen sparsity, these kernels can run…
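OpenAI's kernels themselves are CUDA, but the underlying idea is easy to sketch: store only the nonzero blocks of a weight matrix and skip everything else, so compute scales with sparsity. A conceptual CPU illustration, not OpenAI's implementation:

```ts
// Conceptual block-sparse matrix-vector multiply: the matrix is divided into
// fixed-size blocks and only nonzero blocks are stored and multiplied.
const BLOCK = 2; // block edge length

interface Block {
  row: number;      // block-row index
  col: number;      // block-column index
  values: number[]; // BLOCK*BLOCK dense entries, row-major
}

function blockSparseMatVec(blocks: Block[], x: number[], nRows: number): number[] {
  const y = new Array<number>(nRows).fill(0);
  for (const b of blocks) {           // iterate nonzero blocks only
    for (let i = 0; i < BLOCK; i++) {
      for (let j = 0; j < BLOCK; j++) {
        y[b.row * BLOCK + i] += b.values[i * BLOCK + j] * x[b.col * BLOCK + j];
      }
    }
  }
  return y;
}

// 4x4 matrix with 2 of 4 blocks nonzero -> half the multiply work of dense.
const blocks: Block[] = [
  { row: 0, col: 0, values: [1, 0, 0, 1] }, // identity block
  { row: 1, col: 1, values: [2, 0, 0, 2] }, // 2x identity block
];
console.log(blockSparseMatVec(blocks, [1, 2, 3, 4], 4)); // [1, 2, 6, 8]
```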
sw.kovidgoyal.net
kitty - the fast, featureful, GPU based, terminal emulator — kitty documentation
7 min read · 18 saves · From 2018 · Offloads rendering to the GPU for lower system load and buttery smooth scrolling. Uses threaded rendering to minimize input latency. Supports all modern terminal features: graphics (images), unicode,…