With Copilot included in professional-grade Office 365, and some politicians claiming their governments should use AI to be more efficient, I’m curious whether some of you have used “AI” to get productive things done, or if it’s still mostly a toy for you.
Really worth listening to this podcast as well. It’s a guy who teaches corporate teams to make the best use of AI. He goes over how to get really great results by treating it as a discussion rather than just asking it a question and expecting an accurate answer in the first instance.
AI has been most useful for tech support for me. I wouldn’t have been able to switch to Linux completely if AI hadn’t instantly found solutions for me, rather than being told by the community to read tomes of documentation.
I also use it a lot to find how to get office apps to do what I want.
I’m famous at work for being a poet, when I actually just ask AI to write a short witty poem.
You can use image generators to make nice personalised cards to share on special events.
AI can make mind maps and things like that if you tell it what you want.
I would say that I have used an LLM for productive tasks unrelated to work. I run a superhero RPG weekly, and have been using Egyptian & North African myths as the origin for my monsters of the week. The LLM has taken my research and the monster-creating phase of my prep from being multiple hours to sometimes under one hour - I do confirm everything the LLM tells me with either Wikipedia or a museum. I can also use an LLM to generate exemplary images of the various monsters for my players, as a visual aid.
That means I have more time to focus on the “game” elements - like specific combats, available maps, and the like. I appreciate the acceleration it provides by being a combined natural-language search engine and summary tool. Frankly, if Ask Jeeves (aka ask(dot)com) was still good at parsing questions and providing clear results, I would be just as happy using it.
I use it all the time, and not just for myself or for work. Yesterday I fed my son’s study guide into ChatGPT and had it create a CSV file with flash cards for Anki. It’s great at any kind of transformation / summarizing or picking out specific information.
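That kind of conversion is also easy to script around the LLM output. A minimal Python sketch, with made-up example cards standing in for whatever gets extracted from the study guide (the card text here is hypothetical):

```python
import csv

# Hypothetical question/answer pairs, standing in for whatever
# the LLM pulled out of the study guide.
cards = [
    ("What is photosynthesis?", "Conversion of light into chemical energy by plants."),
    ("Chemical formula of table salt?", "NaCl"),
]

# Anki's text importer accepts simple two-column files (front, back).
with open("flashcards.csv", "w", newline="", encoding="utf-8") as f:
    csv.writer(f).writerows(cards)
```

Anki’s “Import File” dialog then maps the two columns to the front and back of each note.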
When school sends me overly verbose messages about everything that’s going on, I can feed the message into ChatGPT and have it create an iCal file with events for the important stuff happening at school in the coming week.
I used it to write a greeting card for my dad on his birthday (“I’m giving him X, these are his interests, give me ten suggestions for greeting cards”).
I have it explain the reasons behind news stories (by searching for previous information and relating it to the news story). I ask tons of questions about anything I wonder about in the world such as chemical processes, the differences between oil frying and air frying, finding scientific papers about specific things, how to factory reset my Bose headphones… the list goes on.
It’s great for reading docs; I don’t have to search through them to find out how to get a user’s playlists using the Spotify API.
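For reference, the call in question is a single authenticated GET. A rough stdlib-only Python sketch, with the endpoint taken from Spotify’s Web API reference (OAuth token acquisition is a separate step and omitted here):

```python
import json
import urllib.request

API_BASE = "https://api.spotify.com/v1"

def playlists_request(token, limit=20, offset=0):
    # Builds the "Get Current User's Playlists" request from the Web API docs.
    url = f"{API_BASE}/me/playlists?limit={limit}&offset={offset}"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})

def playlist_names(token):
    # Needs a valid OAuth access token (e.g. with playlist-read-private scope).
    with urllib.request.urlopen(playlists_request(token)) as resp:
        data = json.load(resp)
    # The response is a paging object whose "items" are playlist objects.
    return [p["name"] for p in data["items"]]
```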
Copilot makes for a great autocomplete while programming. Saves me a ton of typing.
My physics professor has us compare our answers to physics problems with an LLM’s output. Somehow, the AI is even worse at physics than I am; it once simplified 4π² to 4.
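For the record, the actual value is easy to check:

```python
import math

# 4 * pi^2 is roughly 39.48 -- nowhere near the model's answer of 4.
value = 4 * math.pi ** 2
print(round(value, 3))  # 39.478
```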
I work in maintenance. I’ve looked some stuff up.
I don’t use AI for productive work, for the same reasons I don’t stir my soup with a dishrag.
Pretty good for recipes, tho’.
I have used it as a nicer version of web search, mostly for “How do I write code using this library I’m not yet familiar with?” It provides passable tutorials when the library’s documentation is sparse (I get it) or poorly written (they tried 🤷♂️).
I use it in two ways.
ChatGPT as an interactive search. The last one was an EU GDPR compliance checklist, to get a quick answer on which areas need to be looked at. I use it about once a week for work.
Productive in other ways: I use it once a month for recipes. Recipes are probably my favourite, since I can say “Write it using grams and ml” and “give me some options to replace eggs” and it writes out a legit recipe based on those millions of annoying blog recipes.
JetBrains AI autocomplete for programming, which is getting better slowly, and I’m getting the hang of using it. It’s really good for cases where I have a common thing whose syntax I don’t remember: I just type a variable name like “cspHeaderValue” and it will fill in something that’s very annoying to look up, based on some values I wrote above.
I’m not a 10x engineer because of it; it’s more like +10% overall, and it really depends on the task. I can see it going up to around +50%, but an AI plateau might come before then.
When I had a mold problem it was affecting my mind. I couldn’t think straight or focus, so I had ChatGPT make me a step-by-step plan for dealing with it, had it break each step down into nested sub-steps until no step was more than five minutes of effort, then had it format the plan to copy-paste into Workflowy.
It was really helpful. I could have made that plan myself, except that I was fucked up.
I have been this week, for the first time.
I’m using Hugo to design a new website, and Gemini has been useful in finding the actual useful documentation that I need. Much faster and more accurate than trawling the official pages, and it does a better job of providing relevant examples. It’s also really good at sensing what I’m actually asking, even if I’m clumsy with the phrasing.
And for those who continue to say AI isn’t really useful for learning - another thing I’ve been using it for. “write perl to convert a string to contain only lowercase, converting any non-alpha chars to dashes” - I’ve learned how to do stuff like that over and over again, but the exact syntax falls out of my head after a few months of not doing it. AI is good at providing a quick recollect. I’ve already learned perl properly (including from paper books - yes, I first wrote perl a quarter of a century ago) - and forgotten it so many times. AI doesn’t prevent me learning, just makes it faster.
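The transform being asked for is a two-liner in most languages. A Python equivalent of that Perl request, for illustration (the function name `slugify` is just a label for this sketch):

```python
import re

def slugify(s):
    # Lowercase the string, then collapse each run of non-letter characters
    # into a single dash -- the same effect as the Perl being asked for:
    #   $s = lc $s; $s =~ s/[^a-z]+/-/g;
    return re.sub(r"[^a-z]+", "-", s.lower())

print(slugify("Hello, World!"))
```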
Pretty useful for software engineering, and particularly helpful for writing a test suite; you still need to actually check the output though, ofc.
Also made use of it for writing my end-of-year review, to solve the blank-page problem. I find it a lot easier to edit down than to start HR stuff like that entirely from scratch.
I used AI to generate random fake data to use in Excel training, also to understand various concepts in my field of study and to answer my sudden random questions.
I use it as a glorified Google search since Google search is absolute dogshit these days. But that’s about it. ChatGPT is one of the most over hyped bullshit I’ve ever seen in my life.
It seems like the exact moment Google’s successor showed up, Google had a stroke. It’s awful these days.
Absolutely agree!! LLMs are good for quick “shallow” search for me (stuff I would find on google in a few minutes). Bad for “deeper” learning (because it’s not capable of doing it). It’s overhyped.
You shouldn’t use it for search like that. They (Gemini and ChatGPT) love to be confidently incorrect. Their perfect grammar tricks you into believing their answers, even when they are wildly inaccurate.
Certain offerings, like MS’s, cite their sources inline. I always use it to find those sources and then read the answer from the sources themselves.
I think I’m going to disagree with the accuracy statement.
Yes - AIs can be famously inaccurate. But so can web pages - even reputable ones. In fact, any single source of information is insufficient to be relied upon, if accuracy is important. And today, deliberate disinformation on the internet is massive - it’s something we don’t even know the scale of because the tools to check it may be compromised. </tinfoilhat>
It takes a lot of cross-referencing to be certain of something, and most of us don’t bother if the first answer from either method ‘feels right’.
AI does get shown up when it’s stupidly wrong, which is to be expected, but the world doesn’t care when it’s correct time and again. And each iteration gets better at self-checking facts.
I have it provide me with its sources
I use GPT in the sense of “I need to solve X problem, are there established algorithms for this?” which usually gives me a good starting point for actual searching.
Most recent use-case was judging the similarity of two strings: I had never heard of “Levenshtein distance” before, but once I had that keyword it was easy to work from there.
Also: cmake and bash boilerplate
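Once you have the keyword, the algorithm itself is compact. A textbook dynamic-programming version in Python, keeping only two rows of the edit-distance table:

```python
def levenshtein(a, b):
    # Minimum number of single-character insertions, deletions,
    # and substitutions needed to turn string a into string b.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(
                prev[j] + 1,               # deletion
                cur[j - 1] + 1,            # insertion
                prev[j - 1] + (ca != cb),  # substitution (free on a match)
            ))
        prev = cur
    return prev[-1]

print(levenshtein("kitten", "sitting"))  # 3
```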
Describing a concept and getting the term is awesome with an LLM.
I’ve found documentation and discussions of various strategies I’m considering in tech work.
I describe my idea, the LLM gives me the existing term for that strategy, and then I can find discussion, guides, and theory about that. Keeps me from reinventing the wheel.
It makes sense when you think about it too: It’s a language model, so it should be expected to do a decent job as a glorified dictionary
I use FastGPT on Kagi and it lists the sources for its conclusions, so it’s like a better aimed search
As if Google is any better
Copilot is actually linked directly into their search engine and it provides the links it pulls its data from.
But you’re correct, ChatGPT is not hooked into the live internet and should not be used for such things. I’m not sure if Gemini is or not, since I haven’t used it or looked into it much, so I can’t comment on it. Edit: I stand corrected, ChatGPT is hooked into the live web now. It didn’t used to be, and I haven’t used it in a while, since my work has our own privately trained model running that we’re supposed to use instead.
That’s not correct. ChatGPT is hooked into the live web.
Ah okay, it didn’t used to be when I used it a while back. I edited my comment, thanks for the correction.
Chatgpt also pulls from the web and cites its sources.