I use ChatGPT quite a bit, and it's almost completely replaced Google as my first source when looking for information. I'm good at searching Google to find what I need, but AI is much better (most of the time). The more information available about a topic, the better it does, and while it can certainly get things wrong, I'm very impressed overall. It's not uncommon that I still need to search Google for the final answer, but ChatGPT usually gets me 95% of the way there, so my Google searches end up more targeted.
Most recently it's been giving me book recommendations. After some back and forth, it seems to have figured out my tastes pretty well. It gives me half a dozen recommendations; I read them and give feedback before it suggests more.
I was switching road bike components from one frame to another, and the rear derailleur would not mount correctly. I told ChatGPT what was happening and how the dimensions of the hanger differed. It explained what the problem was and told me there was an adapter available for my rear derailleur. It gave me a part number, which was completely wrong, but it was enough information that a quick Google search turned up exactly what I needed. I could have found all of that through Google alone, but it would have taken much longer.
A few months ago I was looking for a used minivan, mostly for traveling. I asked ChatGPT for a list of suitable vans and for the specs I was most interested in (cargo length, cargo height, ride height, etc.). It was able to quickly produce charts comparing not only different brands but also changes within the same model from year to year (what do I gain/lose by getting a 2020 instead of a 2017?), as well as the differences between trim levels. It turned into a very long conversation over the course of a month or so as we drilled down on what was best for me and my budget. In the end I purchased a Ford Transit Connect that hadn't even been on my radar until ChatGPT suggested it as an option.
I also tried giving it wiring diagrams and asking it to explain how specific vehicle systems operate (what I do for a living), and it failed dramatically. What it initially spit out looked very impressive, but on close inspection it was getting small but important details wrong. I'd tell it something was wrong and ask it to double-check; it would often find the correction, only to make the same mistake again later. When I asked why, it admitted to using general knowledge (such as common relay pin numbers) rather than the specific information from the diagram and vehicle I gave it. It promised not to make those assumptions again, but it went right back to old habits a few days later when I picked up the conversation. These mistakes were obvious to me, but they could have been disastrous if someone just learning were given the same answers.
For anyone who's curious, I recommend playing with it to see how it does. Ask it questions about subjects you're already familiar with so you can judge the quality of its answers. That will also show you the best way to phrase questions to get the right information.
Alan