@benni I mean, that depends on your data? A data migration is just Python code, so there's no generation involved, apart from makemigrations --empty
@rixx ok. let me rephrase the question. my project only works with a complex set of start data in my database. at the moment i use a fixture, but the fixture often breaks when i change the schema. is there a way to populate a fresh database with this complex start data without writing a data migration by hand?
@benni That sounds like "can I take this complex, dynamic data and put it in a simple static interface", so I think the answer is no ;)
You could write a management command that just reads a JSON file and builds objects, and if you change your models, you make simple tweaks in the management command, too.
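A minimal sketch of that idea, assuming a hypothetical JSON layout and a plain dataclass standing in for a Django model (in a real project this would be a models.Model subclass and the code would live in a management command under yourapp/management/commands/):

```python
import json
from dataclasses import dataclass
from typing import List


# Stand-in for a Django model; hypothetical fields for illustration.
@dataclass
class Author:
    name: str
    email: str


def load_seed_data(raw_json: str) -> List[Author]:
    """Read the seed JSON and build objects from it.

    When the model changes, only this function needs a tweak --
    the JSON file itself can stay untouched.
    """
    records = json.loads(raw_json)
    return [Author(name=r["name"], email=r["email"]) for r in records]


# Hypothetical seed data; in practice this would be read from a file.
seed = '[{"name": "Ada", "email": "ada@example.org"}]'
authors = load_seed_data(seed)
```

In a management command's handle() you would do the same, but call Model.objects.create() (or bulk_create) instead of building dataclasses.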
@rixx ok. but then isn't it easier to read the JSON directly as a fixture?
@benni Well, sure, but I thought that was giving you trouble with variations in the data model?
@rixx yes. but your solution would generate even more trouble :D i'm thinking about a script that reads the fixture, migrates the database, and writes the fixture again. something like that would be helpful.
@benni I'm using my solution in a lot of projects, and it's no trouble at all (and certainly less trouble than other solutions I'm aware of).
@rixx ok. maybe i am missing your point, but how is it easier to change a JSON file _and_ a Python file compared to just changing the JSON?
@benni I wouldn't change the JSON (unless the data model changes fundamentally, and in that case no fixture will be able to persist anyway). I'd just add some adjustments in the JSON transform process, leaving the data untouched.
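The "adjust the transform, not the JSON" idea can be sketched like this, assuming a hypothetical field rename from "email" to "contact_email" in the model:

```python
import json

# Hypothetical example: the model field was renamed from "email" to
# "contact_email". Instead of rewriting the JSON data file, the
# transform step absorbs the rename -- the data stays untouched.
RENAMED_FIELDS = {"email": "contact_email"}


def transform(record: dict) -> dict:
    """Map a raw JSON record to the current model's field names."""
    return {RENAMED_FIELDS.get(key, key): value for key, value in record.items()}


raw = json.loads('[{"name": "Ada", "email": "ada@example.org"}]')
rows = [transform(r) for r in raw]
```

Each schema change then becomes one small edit to the mapping (or a one-line tweak in transform), while the JSON file keeps working across model versions.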
@rixx ah. now i got it. thanks :D
The social network of the future: No ads, no corporate surveillance, ethical design, and decentralization! Own your data with Mastodon!