
ChatGPT - Really



ChatGPT has been trained on large amounts of data from the Internet, and can generate output for just about anything, including programming languages. That doesn't mean its output is good, working, or valuable - results vary.


6 minutes ago, Dox said:

It sounds to me like this ChatGPT is a complete waste of time.

No, it's an amazing technology (with significant flaws that are being improved).

6 minutes ago, Dox said:

But how does "it" manage to write scripts for Logic?

The same way it writes code for any other language, or poetry, or screenplays, or whatever - the process is identical. I'm not going to go into how it works here, but it doesn't "know" Logic any more than it "knows" anything else. All it does, in simplistic terms, is determine the most likely next word to appear in the response.
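That "next most likely word" idea can be sketched with a toy bigram model. This is a drastic simplification of how ChatGPT actually works (real models use neural networks over tokens, not raw word counts), but it shows the basic mechanism: output is driven by what tends to follow what in the training text, not by any understanding of the subject.

```javascript
// Toy "next most likely word" predictor: count which word follows which
// in a tiny training text, then always emit the most frequent follower.
const training = "the cat sat on the mat the cat ate the fish".split(" ");

// Build bigram counts: word -> { followerWord: count }
const bigrams = {};
for (let i = 0; i < training.length - 1; i++) {
  const word = training[i], next = training[i + 1];
  bigrams[word] = bigrams[word] || {};
  bigrams[word][next] = (bigrams[word][next] || 0) + 1;
}

// Pick the most frequent follower (null if the word was never seen).
function mostLikelyNext(word) {
  let best = null, bestCount = 0;
  for (const [next, count] of Object.entries(bigrams[word] || {})) {
    if (count > bestCount) { best = next; bestCount = count; }
  }
  return best;
}

console.log(mostLikelyNext("the")); // → "cat" ("cat" follows "the" twice)
```

Scale the training text up to a large slice of the internet and swap the counting for a neural network, and you get a rough intuition for why there's plenty of signal for mainstream languages but very little for Scripter.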

There's not a lot of specific Logic Scripter code published on the internet, so its results will likely be worse than for something more widespread, like general JavaScript or PHP. And ChatGPT will happily give you a confident answer that is completely wrong.

We've already had people here posting ChatGPT-generated Scripter code, and saying "this doesn't work, can you fix this please?"

Edited by des99

1 hour ago, enossified said:

Shall do? It sounds like ChatGPT wrote that quote 😉

Some of the people I work with use “shall” almost exclusively. Drives me bonkers! 😅

25 minutes ago, Dox said:

I think I will avoid this "amazing technology" until it has some reliable use.

It is pretty amazing already, but for help with code, the more you already know about coding yourself, the better. When I use it, it usually takes me a few tries to get it to give me something I'm happy with.

As des99 said, it doesn't know Logic's Scripter that well, and usually gives you code that, although it's valid JavaScript, won't work in Scripter unless you tweak it yourself or tell it how to correct it.
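To make that concrete, the tweaking usually amounts to replacing generic JavaScript idioms with Scripter's own small global API: there is no console.log inside Scripter (you log with Trace()), and instead of running top to bottom, a script defines a global HandleMIDI(event) callback that Logic calls for each MIDI event. Here is a minimal transpose script in genuine Scripter style; the mock classes at the bottom stand in for Scripter's built-in NoteOn/NoteOff only so the sketch can be exercised outside Logic, and are not part of the script itself.

```javascript
// Minimal Scripter-style transpose script. HandleMIDI, NoteOn/NoteOff and
// event.send() are genuine Scripter API; note there is no console.log
// inside Scripter - you would log with Trace() instead.
function HandleMIDI(event) {
  if (event instanceof NoteOn || event instanceof NoteOff) {
    event.pitch += 12;         // transpose note events up an octave
  }
  event.send();                // pass the (possibly modified) event on
}

// --- Mock of Scripter's environment, only so this runs outside Logic ---
class NoteOn {
  constructor(pitch) { this.pitch = pitch; this.sent = false; }
  send() { this.sent = true; }
}
class NoteOff extends NoteOn {}

const ev = new NoteOn(60);     // middle C
HandleMIDI(ev);
console.log(ev.pitch, ev.sent); // → 72 true
```

ChatGPT will often produce the transposition logic correctly but wrap it in a plain function or a console.log-style harness, which Scripter silently ignores because it only ever calls the named callbacks.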

J.


I recently tried ChatGPT. With a one-sentence request it was able to write Logic Pro Scripter scripts. Some of them even worked! Many, though, were complete fiction with no chance of working - totally made-up functions that don't exist in the Scripter API.

How and where ChatGPT got this fictitious info is anyone's guess. I can see it saving some time for roughing out skeleton code, as long as you are willing to fix the fiction.

It was particularly bad when I asked it to do things I was confident Scripter can't do. Instead of a "sorry, not possible" response, I got a completely fictitious solution. At first I thought ChatGPT had dug some undocumented features out of the blogosphere somehow, but nope - it was making up non-existent functionality out of thin air, albeit as well-formed JavaScript.

Moral of the story: ChatGPT is not much more than a language pattern recognizer. It very often cannot actually analyze its results or reason through the veracity of anything. Garbage in, garbage out. What it does do very well is construct grammatically correct language and act like a smart human being, but really it's just a dumb actor. It can still be useful for constructing the skeleton of some Scripter code quickly without having to write it all by hand, but you still have to know how to read it and edit out the fiction. I think it can definitely take a lot of the JavaScript pain out of the process, as it seems to understand JavaScript quite well - which is child's play compared to English.
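The kind of skeleton code worth having roughed out for you is the standard Scripter scaffolding: a PluginParameters definition plus the ParameterChanged and HandleMIDI callbacks, which you then fill in and fact-check yourself. PluginParameters, ParameterChanged, and HandleMIDI are genuine Scripter globals; the one-line NoteOn mock at the bottom exists only so the sketch runs outside Logic.

```javascript
// Typical Scripter skeleton: one slider parameter plus the standard callbacks.
var transpose = 0;

// Scripter reads this global array to build the plug-in's UI controls.
var PluginParameters = [
  { name: "Transpose", type: "lin", minValue: -12, maxValue: 12,
    numberOfSteps: 24, defaultValue: 0 }
];

// Logic calls this whenever a control changes.
function ParameterChanged(index, value) {
  if (index === 0) transpose = value;
}

// Logic calls this for every incoming MIDI event.
function HandleMIDI(event) {
  if (event instanceof NoteOn) event.pitch += transpose;
  event.send();
}

// --- Mock of Scripter's built-in NoteOn, only so this runs outside Logic ---
class NoteOn { constructor(pitch) { this.pitch = pitch; } send() {} }

ParameterChanged(0, 7);        // user drags Transpose to +7
const ev = new NoteOn(60);     // middle C arrives
HandleMIDI(ev);
console.log(ev.pitch);         // → 67
```

The "fiction" typically shows up as extra calls to plausible-sounding but non-existent helpers grafted onto exactly this scaffolding, which is why you need to know the real API well enough to spot them.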

Edited by Dewdman42

It's only 'deep' in the sense of how much it has to 'learn' before it achieves whatever qualifies as 'intelligence' these days.  That said, I'm really interested in task-specific AI, where instead of feeding the engine every published dataset it can 'learn' from, it would be fed - for instance - everything that pertains to a given operating system and the hosted app in question, like Logic Pro.  The promise of such an AI is that every composer/producer would have an expert engineer on hand to implement verbal commands to build and maintain a session!  I'd love to be able to talk my way through building a session template and set it up to record "a vocal track, using X plugins, with four instrument tracks, each with Arturia's Busforce loaded but disabled..."

Maybe we're closer to that than we think?


11 minutes ago, fr3Q said:

Maybe we're closer to that than we think?

It's probably still a way away, but I agree that if we could get there (perhaps it's inevitable?), it would be a major change in how people use computers - e.g. having a computer-based assistant that can actually do the kinds of things a person does: learn your workflow and your preferred ways of doing things, spot things you are doing badly and offer solutions to improve them, proactively action things it knows you'll need to do, and be genuinely helpful in a way that doesn't require the user to manually choose and perform every task.


3 hours ago, des99 said:

It's probably still a way away, but I agree that if we could get there (perhaps it's inevitable?), it would be a major change in how people use computers...

 ...a mirror reflection of yourself in other words...

Edited by Atlas007

21 hours ago, Atlas007 said:

 ...a mirror reflection of yourself in other words...

...or a better version of yourself, when it comes to piloting something like LPX.  Lol.  Imagine being a newbie with a deep app like LPX or FCP, with AI built in.  The learning curve will be less about how to do this or that task, and more about what the tool can do based on what you know about it, and what commands you can get away with throwing at the AI that can be implemented in a useful way.  It will actually make RTFM a lot more useful for understanding the nature of an app, as defined by its particular implementation of functions, rather than as a how-to for doing things.  The AI will take care of the how; you just have to know what you want, with the right terminology to get there, which might give you insights into why you want the tool to do these things.  Lol, that brings one right back to AI being a mirror of sorts!

It's like driving a car.  We become pilots of the 'vehicle' as much by intuition as by knowing the basic operational aspects of the machine, rather than being forced to learn how the drive train operates, what every mechanic should know, or why the engineer made the trade-offs they did.  As with a car, which takes you from A to B once you've learned the basics, you just have to decide what kind of driver you want to be: a 'safe' driver, or a reckless one!  That's how I expect AI is going to change how we approach using tools like LPX.


3 hours ago, fr3Q said:

...or a better version of yourself, when it comes to piloting something like LPX...

Interesting point of view!

Since the creators don't actually know how an AI like ChatGPT comes up with a given outcome, bets remain open on predictions, I guess...

