I've tested many dozens of AI tools and features during the research and writing of this blog. I would have liked to use even more: had I owned a graphics card powerful enough to download and run different LLMs locally, I would have. My Dell workstation, though, is limited to 32GB of RAM, a 4GB NVIDIA Quadro card, and an ageing Xeon processor. My Dell laptop has 16GB of RAM, a 12th-gen i7, and a 4GB RTX 3050. These are averagely powerful pieces of hardware, but they're insufficient to run much in the way of open-source AI models, at least ones that I'd use as regular tools.

I state this to demonstrate that most people's usage of AI tools is likely to be through 'cloud' services, such as the new (clunky) ChatGPT-4 iOS app. That's a shame, as it hinders people's understanding of how these tools work and what sort of libraries they depend upon. This may ease as smaller, more capable LLMs become available, but it's probably a bit too late. Why is this of...
6 months to AGI?