Take Control Books on A (not) I, fyi

Interesting and thoughtful policy from Joe Kissell:

https://www.takecontrolbooks.com/our-stance-on-ai/

7 Likes

That’s typically Joe; even-handed and level-headed. I’m taking a class (online) about using LLMs for practical purposes like file management, but it’s more on the basis of understanding than planning to use them.

Most of the time, I’d rather use Perl, or an automation/shortcuts app.

What I want is a local LLM I train on early, copyright-free texts I’ve parsed, for textual analysis.

3 Likes

That’s a post worth sharing. Cheers Joe and thanks Lisa for the steer.

2 Likes

I agree with Joe about “AIs are not I.” I refer to it as “AS” with the “S” meaning “stupidity”.

You can download, install, and run many open-source LLMs. I have never personally done this, but one good source (which includes a driver app compatible with lots of different models) is Ollama.
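
For anyone curious, the basic Ollama command-line workflow looks roughly like this (a sketch, not an endorsement of any particular model; `llama3.2` is just one example model name, and availability depends on what the Ollama library currently hosts):

```shell
# Download a model from the Ollama library (model name is an example)
ollama pull llama3.2

# Chat with it interactively in the terminal
ollama run llama3.2

# Or pass a one-off prompt non-interactively
ollama run llama3.2 "Summarize this paragraph in one sentence: ..."

# List the models you've downloaded locally
ollama list
```

Once a model is downloaded, everything runs on your own machine, which is part of the appeal for people wary of sending text to someone else’s server.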

There are other ways as well, but I’m nothing even resembling an expert here.

For a very long time (including many posts here), I have refused to use the term “AI” when discussing this tech. I instead use non-emotionally-charged terminology like Machine Learning (ML), Computer Vision (CV), or Large Language Models (LLMs), because although these neural-network algorithms can do lots of useful things, there is nothing intelligent about them.

There are some research groups actually trying to create machine intelligence, but that research is far far away from being what today’s purveyors of chatbots are claiming about their products.

One really great book on the subject (from 1995) is Douglas Hofstadter’s Fluid Concepts and Creative Analogies. This isn’t so much a book on AI as it is a collection of papers (from Hofstadter’s research group) describing various kinds of algorithms intended to explore and understand what it means to be intelligent (whether human or machine).

5 Likes

Yep, it is very susceptible to the “GIGO” process.

Garbage in, gospel out?

3 Likes

Hofstadter’s ‘Gödel, Escher, Bach’ was huge in my graduating year.

Close, but no cigar! I’m guessing you are just being facetious. :laughing:

What a well-written, informative piece by Joe.

He acknowledges his own use of the bots for coding, debugging, and the many other ways LLMs can improve our lives. Yet he is spot on when he reasons that a Take Control book by — or on — A.I. is ridiculous.

1 Like

With all due respect to Joe, and acknowledging that I would have made exactly the same decision with respect to a book on AI, I am frustrated that it’s so hard to keep up with the legitimately exciting things that are happening in the AI world. It seems that at least some developers are able to devote the necessary time to figuring out how to integrate Claude Code into their actual work, but I find that it takes me quite some time—at least weeks, if not months—to figure out the best way to use any new tool for real work. With AI, that’s a tremendously annoying amount of time, since it means that just as I’m feeling as though I’ve wrapped my head around the utility of what I’ve been doing, some new development calls into question the entire scenario.

For instance, I’ve been experimenting with various AI dictation tools. I like the idea of them, and I use dictation heavily on the iPhone for messages and reminders and the like. But for actual writing, it has proven much more difficult, since they all work by sending voice off to a server for transcription, which means there’s some delay. It turns out that, in a lot of cases, I don’t actually know what word I’m going to write next until I’ve seen the previous ones, so I’ve had trouble working them into my workflow. I’m not certain if that’s a failing on my part, or something to do with their design, or something else, but each time I get frustrated with one, another comes out, making me restart the process to see if it somehow works better in my use case.

Ironically, Apple’s built-in dictation has been easier for me to use because it works on a word-by-word basis, showing each word on screen as it recognizes it. The only problem is that it’s much less capable than the new tools at understanding how to work with proper nouns and punctuation.

(And if you want to talk about dictation tools, I’ll move that to a new thread.)

2 Likes

I admire your persistence and, for lack of a better term, impersonal, or agnostic, approach to tech, @ace , and your tolerance of the frustration you describe. Lots of things these days (:wink: insert topic-hijacking snark here) seem to be developing faster than we can keep up with them (because :grinning_face_with_smiling_eyes: insert random conspiracy theory here)!

The Apple OSes/services, for example, keep getting more and more complicated, intertwined, and imho buggy. So, since I am not making my living from tech, I am actively moving back to paper and pen, investigating Linux and ‘dumb phones’, keeping in mind that I still need to interact with an increasingly digital society.

Still, as a former journalist, I enjoy reading about things I’m not acquainted with and learning how people approach them, so please go ahead with the dictation Topic!

(:face_with_peeking_eye: remove parenthetical distractions)

1 Like

Slight tangent, but if you haven’t used Agents and Skills directives, I’d recommend looking into them. These are files which many (all?) AI coding agents (Claude Code and Codex for sure) look for and read before working on your request to “Build me an app that…”. The best I’ve seen so far for SwiftUI coding in particular comes from Paul Hudson, and is freely available for both platforms.
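
As a rough illustration (the exact file name and contents vary by tool — AGENTS.md is the convention Codex reads, while Claude Code looks for CLAUDE.md — and every line below is a hypothetical example for a Swift project, not Paul Hudson’s actual file), such a directive file is just plain Markdown in the project root:

```markdown
# Project guidelines for coding agents

## Build & test
- Build with `swift build`; run `swift test` before proposing changes.

## Style
- SwiftUI views: prefer small, composable views; no force-unwraps.
- Match the existing formatting; don't reformat unrelated code.

## Boundaries
- Never commit secrets or edit files under `Vendor/`.
```

The agent reads this before touching your code, so project-specific rules don’t have to be repeated in every prompt.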

@LisaS ’s reply mentioning file management reminded me of an excellent blog written by Howard Oakley (eclecticlight.co) on the dangers of using AI when you’re not in a position to judge the validity/appropriateness of AI ‘answers’: How online search and AI can install malware – The Eclectic Light Company. Very scary.

And with Claude’s latest advances in discovering zero-day exploits, the dangers are not getting any smaller.

1 Like