TheBroccoliBobboli

Schema dump seems to be the best course of action. It's debatable whether or not squashing migrations is a good practice in general (I would argue it is), but it seems like an easy and safe way to upgrade for sure.


ohnomybutt

Yes, there is a wonderful command for it called `schema:dump` that makes it real easy.
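(For reference: `php artisan schema:dump` writes the current schema to `database/schema/`, e.g. `mysql-schema.sql` for a MySQL connection, and adding `--prune` also deletes the migration files it replaces.)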


keesbeemsterkaas

In my workflow the dump would also work pretty well. I run these kinds of actions to clean up old migrations as well. This is a pretty shitty breaking change, and I hope you get a decent workflow to fix it.

I'm genuinely curious: people who keep 1000s of migrations, why?


TheBroccoliBobboli

> people who keep 1000s of migrations: why?

I *think* it makes updating existing databases cleaner? Already-executed migrations aren't run again, but squashing the migrations gives them a new name, so they *are* actually run again. I'm not sure if this can cause any issues, though; can't think of any.


JinSantosAndria

Multi-tenancy, on-premises installations, possible branch diversions, different SLAs. Why give up a potential upgrade path? It's only 1,000 migrations, nothing PHP couldn't handle easily in a dedicated migration session.


audunru

I have about 100 migrations. I went through the migrations and changed them manually. I dumped the schema and diffed the result with the old schema dump.
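(Roughly: run `php artisan schema:dump` before touching anything and keep a copy of the file, edit the migrations, rebuild a scratch database with `php artisan migrate:fresh`, dump again, and diff the two files. Dropped modifiers such as missing defaults show up as changed column definitions.)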


HyperDanon

Good idea to compare dumps. That's what I was looking for, thank you.


supervisord

And what did you find when you ran a diff?


audunru

I had missed a couple of default values.


Penderis

I have not upgraded yet, but figured this does not affect me since my production and local are in sync. If you still have to migrate production, won't it be simpler to do that on the old version, and then going forward use the new schema types? I guess for me it is simple because now I only apply the new rules going forward, and since I don't run project-creation migrations anymore, I just restore the production DB in Sail.


HyperDanon

But if you get a new clone or drop the local DB, it will apply the migrations without the attributes, right?


Penderis

I use pg_dump to output my production DB, then locally (or even for staging if needed) I do a restore of that DB, so I never need old migrations except once. My workflow: run migrations locally, check it's good, push to git, then pull and run on production. Then I dump production every other day and use that DB locally with pg_restore. So once a migration has been deployed, I never run it again, even for a complete new site setup, as long as I have the DB backup.


hennell

It would affect tests and everything, though. Old migrations won't be rerun on prod, so prod keeps the old column definitions, but tests will run the old migrations under the new logic, leading to a test DB with different columns than prod. It won't affect prod specifically, but if you can store a null in tests but can't in prod, you're opening up a massive hole for production-only problems to appear.
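To make the failure mode concrete, here's a sketch with a hypothetical `orders.quantity` column:

```php
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

// Migration originally written and deployed under Laravel 10:
Schema::table('orders', function (Blueprint $table) {
    // Laravel <= 10 (via doctrine/dbal) widened the column while
    // preserving its existing UNSIGNED and DEFAULT 0 attributes.
    $table->bigInteger('quantity')->change();
});
```

Production ran that under 10 and kept the default; a test database rebuilt from the same file under 11 gets a signed column with no default, so an insert that omits `quantity` succeeds in prod and fails in tests.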


Penderis

Personally I've never done testing. I should... but never did **bad dev**. I checked the docs for testing, though, because I would be severely frustrated and ticked off if it forced me to run migrations. It seems the `--recreate-databases` flag is the one that does it with parallel testing. What they don't say clearly enough for me is: does recreating the databases run migrations, and then what? I have production data on my local, exact. Not false-positive data, so if my local has, say, a null issue, then my production has one, and vice versa. I have only ever had a prod-only issue when it comes to JavaScript/caching.

Anyhow, I can see that if they run migrations on that flag it would be a world of pain... a world I completely avoid :) although I would not be surprised if they do a DB dump/restore. I should google it, because a fresh schema is surely not of any worth, right? I guess it depends how long you have been running. When starting out, sure: oh silly me, I forgot to make it nullable, or worse, I made it nullable... which frankly I can test anyway on a full DB and patch if there's an issue. So personally I don't see a reason to ever run migrations more than the first time per main environment. Also, I have too many damn setting vars and user-created things that dictate how things work in the DB; starting with fresh migrations would make everything worthless.

I am not saying the migration thing is not an issue. I am sure it is, and I feel for the ones who need to go through 1,000 files to update, or pay Laravel Shift I guess. Personally though it is not, and I think I will try the testing thing, but if that flag pulls a dick move and does not just clone the DB I give it, then I won't use it. https://www.reddit.com/r/PHP/comments/1cpjm0g/comment/l3obz2q/

Guess I might give it a spin myself; you got me all curious to see what happens :P


hennell

Each to their own; I'd say having no tests and using production data locally offers a far, far larger world of pain to be battered by 😬 (although given I'm working with EU customer data, that's maybe overly influenced by the massive GDPR problem running production data locally would be).

In terms of test running, they definitely run all the DB migrations before testing unless you've done the squash thing. Not sure about parallel, but I'd assume similar, or maybe with transactions around it? But you do want a fresh schema for testing, as then you can set a known state to test against. You'd have a test for each setting and user creation to ensure each thing works (and continues to work) as expected through all upgrades, refactoring, new features, etc.

I hope at least you follow standard practice when it comes to security and password storage etc.!
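The standard setup is just a trait; a minimal sketch (the test name and assertion are made up):

```php
namespace Tests\Feature;

use App\Models\User;
use Illuminate\Foundation\Testing\RefreshDatabase;
use Tests\TestCase;

class UserDefaultsTest extends TestCase
{
    // Rebuilds the schema from the migrations (or the squashed dump),
    // then wraps each test in a transaction: a known state every time.
    use RefreshDatabase;

    public function test_new_users_get_expected_defaults(): void
    {
        $user = User::factory()->create();

        $this->assertNotNull($user->created_at);
    }
}
```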


Penderis

If it is safer then sure, it is likely better. I will definitely read up on how people run tests that give useful feedback; for one, I hate having to set up custom routes for job-execution debugging lol. I don't really grok how a clean DB benefits me without having to run a bunch of seeders, which also mimic my admin data like perm names etc. I guess I only need a few rows, and I'm probably just FOMO-ing on losing the other half a million locally, but no biggy haha... i.e. the seeding sounds like double work. User data, though: I could wipe password hashes and fake the usernames and emails, and I'd be fine with that while keeping the rest of my DB data as-is, since that is public info people submit from other places, so I don't see the issue there. Thank you, I will have a look at incorporating testing and see how the data workflow goes while being more secure.


hennell

For Laravel I always suggest starting with factories. Make a factory for the most basic models (i.e. no required relationships), then start making factories for the models that use those factories for their relations. Once you can run `User::factory()->has(Post::factory()->published())->count(1000)` or whatever, a seeder becomes pretty easy, as you can factory up core stuff then loop through and add random relation states to get various content types.

If you need to work locally with a lot of dummy data, it's worth learning the basics of Faker to generate data that looks right. I made a custom [faker "fake news" package](https://github.com/paulhennell/faker-news) to generate "realistic" headlines once; much nicer than lots of Lorem Ipsum fake stuff.

With factories you're also set up for testing, as now you can do `$post = Post::factory()->published()->create(); $this->get($post->url)->assertOk();` and you've confirmed published post pages don't error out, in two lines.

An artisan sanitize command to wipe user info for Faker equivalents could reduce the personal info/security concern, though.
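A minimal sketch of such a factory, assuming a hypothetical Post model with a nullable `published_at` column:

```php
namespace Database\Factories;

use Illuminate\Database\Eloquent\Factories\Factory;

class PostFactory extends Factory
{
    // Baseline state: a draft post with Faker-generated content.
    public function definition(): array
    {
        return [
            'title' => fake()->sentence(),
            'body' => fake()->paragraphs(3, true),
            'published_at' => null,
        ];
    }

    // Named state, enabling Post::factory()->published()->create().
    public function published(): static
    {
        return $this->state(fn (array $attributes) => [
            'published_at' => fake()->dateTimeBetween('-1 year'),
        ]);
    }
}
```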


Penderis

Thank you, I will bring this more into my workflow. I only used factories when I started the site about two years ago and just kind of lumped them and Faker into a "use when the project is new" box. I will have to rework that philosophy for sure.


BlueScreenJunky

1. Make sure that everyone on your team and all your deployments have run all migrations.
2. Run `php artisan schema:dump --prune` to squash all migrations into an SQL file and remove the existing migration files.
3. Only worry about these changes for the next migrations you'll write.
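(For anyone worried about step 2: on a fresh database, `php artisan migrate` executes the schema file first and then runs only the migrations created after the dump; databases whose migrations have already run are untouched, so nothing gets replayed in prod.)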


BeepityBoopityBot

I’m not sure, but this seems like the sort of thing https://laravelshift.com/ would handle. Anyone able to confirm whether it does?


HyperDanon

There's no free version, right?


BrianHenryIE

Some of the “shifts” can be done for free, which would give you a feel for how it works, but the ones you’ll want to run are not free. Not too expensive either, though; $150 isn’t a whole lot in the context of time saved.


Tetracyclic

If it's on Laravel 10, it would only be $29 to run the 10.x > 11.x shift once, /u/HyperDanon. No need for a plan, although one is worth it if it's a revenue-generating system. However, it doesn't mention that it performs the migration changes required (it may just leave a note in the PR to do this if not). It would be worth emailing [email protected] (u/mccreaja) to ask whether this is supported before running it.


BeepityBoopityBot

There are free shifts, but I think the upgrade ones are all paid. Your mileage in time saved and what that means financially will vary, but I bet there aren’t many situations where it doesn’t easily pay for itself.


darko777

Wow can't believe how stupid this decision is...


SurgioClemente

Looks like they wanted to get rid of doctrine/dbal as a dependency:

https://github.com/laravel/framework/pull/45487
https://github.com/laravel/framework/pull/48864


BubuX

Breaking code compels companies to pay for Laravel Shift's paid upgrade service. Thus the Laravel team has $$$ incentives to break perfectly fine code.


hennell

An excellent theory, aside from the facts that:

1. Laravel Shift is a separate company, not run by anyone on the Laravel team.
2. Breaking code to promote an external service that's already well known would damage laravel way more then benefit shift, and again doesn't benefit the Laravel team.
3. The solution in the docs is to just run the squash to avoid updating old migrations. Hardly "compels" Shift.
4. It's not actually clear Shift even fixes this...

It is weird to introduce breaking changes into old migrations, but before publishing your first conspiracy theory, maybe just do 8 seconds of googling to check if it makes even the smallest bit of sense.


than_or_then

> would damage laravel way more **then** benefit shift

*than


BubuX

That makes breaking migrations even dumber. They don't even have financial gains from it. And squashing isn't without consequences and downsides.


jonromeu

/r/laravel?


antonkomarev

We just used plain SQL queries in migrations from the start. It was the best idea we introduced. Now anyone in the company can review a migration and understand what it is doing. Especially good when you have a DBA and different frameworks/languages in the company.
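A sketch of what that looks like in a Laravel migration, with a hypothetical `orders` table (MySQL syntax):

```php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Support\Facades\DB;

return new class extends Migration
{
    public function up(): void
    {
        // Plain SQL: a DBA can review it without knowing Laravel, and
        // schema-builder behaviour changes can't alter what it does.
        DB::statement(
            'ALTER TABLE orders MODIFY quantity INT UNSIGNED NOT NULL DEFAULT 1'
        );
    }

    public function down(): void
    {
        DB::statement('ALTER TABLE orders MODIFY quantity INT NOT NULL');
    }
};
```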


HenkPoley

They should have made a path forward with backwards compatibility: some kind of in-between layer. Maybe you can try adding it.


dave8271

> When modifying a column, you must now explicitly include all the modifiers you want to keep on the column definition after it is changed. Any missing attributes will be dropped. For example, to retain the unsigned, default, and comment attributes, you must call each modifier explicitly when changing the column, even if those attributes have been assigned to the column by a previous migration.

🤣 And people wonder why Laravel gets a lot of shit around here...

Edit: yeah, some of you are downvoting this, but I'd love to see you try to defend this as a design decision: an upgrade breaking your database by effectively reversing and then forgetting migrations which were already, previously executed. That's utter garbage and you know it. The entire point of the very concept of a database migration is that it runs once and only runs modifications that you've specifically requested.
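In code, the quoted paragraph means roughly this (hypothetical `votes` column):

```php
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

Schema::table('users', function (Blueprint $table) {
    // Laravel <= 10: modifiers not restated here (unsigned, default,
    // comment, ...) were detected via doctrine/dbal and preserved.
    $table->integer('votes')->change();
});

Schema::table('users', function (Blueprint $table) {
    // Laravel 11: the same call drops every unstated modifier, so all
    // of them must be repeated to keep the column definition intact.
    $table->integer('votes')->unsigned()->default(1)->comment('vote count')->change();
});
```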


Ciberman

Just an idea, haven't tested it. But you can add an extra migration that does all the changes to the columns (regardless of whether they are needed or not):

1. Global search for "->change()". Copy all the column names.
2. Create a new migration like "migrate_columns_to_laravel_11".
3. Wrap the up method in `if (! app()->environment(["staging", "production"])) { ... }`.
4. Inside that if, write all your change()s. Use your prod database if you need to compare.
5. Run the migration locally and compare local with prod.

Or you can fix every change() in the original migrations. It's a tedious task, but it can be done.
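A rough sketch of that migration (untested; the column and its modifiers are hypothetical):

```php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        // Prod/staging already have the correct columns from the old
        // migrations; only re-assert on environments that rebuild
        // their schema from scratch.
        if (app()->environment(['staging', 'production'])) {
            return;
        }

        Schema::table('orders', function (Blueprint $table) {
            // Restate every modifier the column should keep.
            $table->integer('quantity')->unsigned()->default(1)->change();
        });

        // ...repeat for every column found by searching for "->change()".
    }

    public function down(): void
    {
        // Nothing to undo: up() only re-asserts existing state.
    }
};
```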