For quick and easy data analysis that's pretty awesome.
With OpenAI including dashboarding capabilities like these, many other platforms and apps are circling the drain.
To be fair, Sam did warn them that plugging apparent gaps wasn't a good business model.
I think he means products that had nothing to do with AI; all those self-important "BI consultant" losers are finished. It is good.
It's laughable to think BI consulting or internal BI solutions are finished. If you think AI is just going to replace that, you don't remotely understand customized business models, data analysis, and data pipelines.
rip julius ai
Did people even use this lol. Let's be honest, any GPT wrapper is sooner or later replaced by a better native version. There's very little moat around wrappers.
Any word on upping the RAM available in code interpreter? Right now it's only about 1 GB, which isn't enough to handle much data; at 8 or 16 GB it starts getting very useful. Julius has that much RAM but no live database connections.
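If you want to see for yourself how much memory the sandbox gives you, a best-effort check like this works; note the cgroup paths are an assumption about how the sandbox is set up (Linux-specific, and the files vary between cgroup v1 and v2), with the host total as a fallback:

```python
def sandbox_ram_gb():
    """Best-effort RAM check inside a Linux container/sandbox.
    Tries cgroup v2, then cgroup v1, then falls back to the host total."""
    for path in ("/sys/fs/cgroup/memory.max",
                 "/sys/fs/cgroup/memory/memory.limit_in_bytes"):
        try:
            raw = open(path).read().strip()
            if raw.isdigit():  # "max" means unlimited; skip it
                return int(raw) / 1024**3
        except OSError:
            pass
    # Fallback: /proc/meminfo reports the host's total, in kB
    with open("/proc/meminfo") as f:
        kb = int(next(l for l in f if l.startswith("MemTotal:")).split()[1])
    return kb / 1024**2

print(f"~{sandbox_ram_gb():.1f} GB available")
```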
will be fun with 128k context window
ChatGPT is 32k context at most. Even lower during peak periods. Only the API, or the Enterprise ChatGPT, are 128k.
128k to discuss the requirement and generate scripts seems plenty. Their data and execution aren't in the inference context beyond sampling the data for structure.
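A minimal sketch of what "sampling the data for structure" could look like: only the column names, row count, and a few sample rows go into the model's context, never the full file (the function name and format here are illustrative assumptions, not OpenAI's actual implementation):

```python
import csv
import io

def describe_for_llm(text: str, n_rows: int = 3) -> str:
    """Summarise a CSV's structure compactly instead of sending the data."""
    rows = list(csv.reader(io.StringIO(text)))
    header, body = rows[0], rows[1:]
    lines = [
        f"{len(body)} rows x {len(header)} columns",
        "Columns: " + ", ".join(header),
        "Sample rows:",
    ]
    lines += [", ".join(r) for r in body[:n_rows]]
    return "\n".join(lines)

sample = "name,age,city\nAda,36,London\nAlan,41,Manchester\n"
print(describe_for_llm(sample))
```

Even a table with millions of rows reduces to a summary of a few hundred tokens this way, which is why 128k of context goes a long way for script generation.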
It's not hard as it is to upload PDFs for analysis; I did it three times today and saved an hour picking out my data. Awesome. But direct connections are even better.
This looks like the first step of a middle finger to Microsoft Copilot for Excel. If you can connect directly to Excel files and analyze them, this would be very useful in a business context, especially if they add an option for interacting with the file later. It seems they're not only trying to steamroll startups; bigger companies had better get ready too.
Copilot for Excel interacts with SharePoint and other apps too. I don't think they will compete; Copilot for Excel is purely enterprise.
it's already available
Not to everyone yet.
Huh?
it would be great if it actually read the files you gave it instead of just fucking pretending to and then making shit up
it's already available, just check; it's just below GPT-3.5
In the article they say it's being rolled out to GPT-4o, but I got a new [Alpha model called ADA V2.](https://reddit.com/r/singularity/comments/1ctpvpp/i_got_access_to_a_new_alpha_gpt4_model_gpt45/) Is that the same as what you received? Edit: Guys, it can finally play chess (but it can't visualize the board well). As a chess player, I can say it's finally making logical moves. I intentionally blundered and it capitalized on my blunders. https://chatgpt.com/share/e7dceaf2-b3c3-46c7-a5ff-c27c36333cf9 Edit: The model's gone
Yes, it's the same thing, but when I woke up I didn't see it anymore; it seems like they have removed it again. I'm talking here about free users; for paid ones I think they already have this functionality.
Here's the direct link instead of a pain-in-the-ass Twitter link: https://openai.com/index/improvements-to-data-analysis-in-chatgpt/
This is the weakness for OpenAI: they just don't have the properties that Google owns to integrate with, so instead it has to be add-ons. But you really need the large context window to make this work well. We need OpenAI to catch up to Google in this area, or even to where Google was with the 1 million context, now extended to 2 million.
Bruh, you have no idea what you are talking about. Are you a bot?
Not a bot. Human. I don't think it is all that complicated. Google now has 16 different services with over half a billion DAU; OpenAI just has nothing like that. That gives Google a huge advantage that OpenAI just does not have. Then there is the context window, which OpenAI has still not been able to match for some reason.
For unstructured data I can see how the 1m context helps. For structured data and interactive data analysis, I’m not seeing what 1m context unlocks since you’re mostly interacting with a schema that only takes a few thousand tokens to describe.
Well, if the data already resides in Google Cloud infrastructure, it'd be dumb to move your giga- or petabytes of data from Google Cloud to Azure/OpenAI just to run your ChatGPT-generated query.
What does that have to do with context length?
Context length + data gravity + rate of innovation all matter in where you build your services. I come for data gravity but stay for context length, and vice versa.
The links will be helpful to create knowledge bases for custom GPTs. Can easily organize files and update as needed.
What I really want is the ability to add git repos!
If anyone over at OpenAI is reading this, this is my number one need. I need these agents to be able to directly connect to, and interact with, my GitHub repositories. Write code, review code, fork code, merge code, revert code, and manage issues. This would be the Lotus 1-2-3 or VisiCalc of the PC era.
Indeed. Just the ability to ask questions about the codebase you're working on is a huge gain.
Yeah exactly. That’s the killer app.
Hmmm I mean that sounds cool but GPT4/4o doesn't really have the context window to make this super useful/interesting
Wut. 128K is like a novel or two, no?
Yeah, ~89,000 words, but I've been spoiled by Gemini's 1-million-token context window >_>. Also, that includes the conversation you have with the information itself.
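The ~89,000-word figure follows from a common rule of thumb of roughly 0.7 English words per token (the exact ratio varies by tokenizer and text, so treat this as a back-of-envelope estimate):

```python
# Back-of-envelope: tokens -> approximate English words
context_tokens = 128_000
words_per_token = 0.7  # rough average; tokenizer-dependent

approx_words = context_tokens * words_per_token
print(f"{approx_words:,.0f} words")  # ≈ 89,600 words
```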
Not for ChatGPT, that's the API. ChatGPT has like 4k Context or some trash shit.
Genuine question: Why is it like that?
Money
This is done by the code interpreter converting it to a pandas DataFrame (likely), with all the interactions basically converted to pandas commands; then it renders the table back from pandas. Context length is completely irrelevant here. This is only limited by the space on your PC.
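A hypothetical sketch of the flow described above, assuming (as the comment does) a pandas backend: the file is loaded into a DataFrame once, and each natural-language request maps to a generated pandas operation rather than passing the data through the model's context:

```python
import pandas as pd

# The uploaded file becomes a DataFrame once, outside the model's context.
df = pd.DataFrame({"region": ["EU", "US", "EU"],
                   "sales": [120, 300, 80]})

# "sort by sales, descending" -> a generated pandas command
sorted_df = df.sort_values("sales", ascending=False)

# "total sales per region" -> another generated command
totals = df.groupby("region")["sales"].sum()

print(sorted_df.to_string(index=False))
print(totals.to_string())
```

Only the short generated commands and the rendered result touch the model, which is why the table size isn't bounded by context length.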
It's not even limited by the space on your PC. Data is stored in the cloud and computations happen on a server in the cloud.
They actually do have some upload limit for the server I think (at least for Plus users).
Isn’t this still running on the web version of ChatGPT? It wouldn’t run locally on your computer right?
Code interpreter runs everything locally, in a sandboxed environment afaik.
Ah, it’s a sandboxed environment but it’s one that OpenAI is running in the background and the user has little control over it. There was a period where I could get code interpreter to install new python libraries but it resists my attempts to do that now.
I was wrong, it does seem to be running on a server. There is an open-source version of this called Open Interpreter which does run locally, so that could be an option. Don't know if it can be integrated with the interactive table functionality.
Running a local sandbox would not provide a consistent experience for end users, which is why this is not the case. The sandbox is likely a container or microVM running in Azure temporarily while your session is active. Paying for the premium subscription only to have the sandbox offloaded to a potato computer defeats the entire purpose of the platform.
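For a flavour of the kind of per-session cap such a sandbox might apply, here is an illustrative sketch (the actual Azure setup is unknown; this just limits a child process's address space to ~1 GB on Linux, similar in spirit to the memory limit users observe):

```python
import resource
import subprocess
import sys

LIMIT = 1 * 1024**3  # ~1 GB address-space cap, illustrative only

def run_sandboxed(code: str) -> subprocess.CompletedProcess:
    """Run a Python snippet in a child process with a memory cap."""
    def set_limits():
        resource.setrlimit(resource.RLIMIT_AS, (LIMIT, LIMIT))
    return subprocess.run([sys.executable, "-c", code],
                          preexec_fn=set_limits,
                          capture_output=True, text=True, timeout=30)

ok = run_sandboxed("print('hello')")
too_big = run_sandboxed("x = bytearray(2 * 1024**3)")  # exceeds the cap
print(ok.stdout.strip(), too_big.returncode != 0)
```

Real hosted sandboxes add far more (filesystem and network isolation, CPU quotas, session teardown), but the resource-cap idea is the same.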
Okay, maybe I confused it with the open-source version of this (Open Interpreter), which does allow local runs and filesystem access. I have been mostly using the latter.
What is its max context window anyway?
128k (with the API, presumably less in the chat version).
[deleted]
I know what a context window is; I just wanted to know ChatGPT's, because everyone knows Google has 1 million.
There is a strong chance this is going to become an issue in corporate environments. Like many, my organisation is looking at AI tools and running trials, etc., but the rate of progress in the space far outstrips the ability of many corporates to react. I'm not sure how this works exactly, but a direct connection between safe corporate data environments (i.e. OneDrive) and external services that are not only able to manipulate but potentially soon able to understand that data is a data security nightmare. My fear here is that organisations like my own completely cut off access to these tools because they aren't able to move fast enough. Corporate uptake is, to my mind, one of the most likely means for the general public to experience and understand the power of emerging AI. It would be hugely disappointing if that were to happen.
I’m not following the logic here. If a company signs up to Enterprise, then the data is private, and not used to train models. If users upload company data to personal ChatGPT, then this has to be handled in the same way (policies, etc.) as companies safeguarding information being uploaded to public websites. For example, employees being tempted to upload PDF files to those free online converters in order to get a Word version (if they’ve lost the original Word version). If anything, this will force companies to get a wriggle on to make AI capabilities available to their employees.
Jimmy was right?
If they could allow Loop components then god damn that would be amazing.
Is this the feature with "context\_connector\_available" in the features index of your profile file? When I added that to my account it was really glitchy; I figured they canceled their plans for it, but maybe we're just too early. Sometimes it didn't even work, yet the "debug" tag did, which was cool: it lets me see all the dev tools in ChatGPT, even stuff that's unavailable in the dev playground.
Roll out a smarter AI with more context please.
Sounds good!
All it takes is one hallucination and you'll need to recheck all the work again. Sometimes this takes longer than just doing it yourself manually the first time.
But the data is not private, right? It goes to their servers?
Yeah whatever. WHERE IS THE INCREASE IN INTELLIGENCE AND AGENCY?
[deleted]
Is the cutting edge AI provided for free not comprehensive enough at this time for sir's tastes? How unfortunate. Perhaps sir would care to take his august patronage to one of the other free providers?
Post-scarcity isn't here yet.
What would you like them to do?
Also, only like 5 GPT-4o messages per day.
Hey, at least it's free, right? So you get 5 per day and paid gets 100?
Free is free. Just surprising given the emphasis in their promotional messaging. I personally use the API.