I have the following Django code running against PostgreSQL, with Huey (a task queue and scheduler) driving a periodic task. The problem is that every time the periodic task runs, Django overwrites the existing rows in the table instead of adding new ones on top of them.
Scheduled code:
import datetime

import nitrotype  # third-party Nitro Type API client
from huey import crontab
from huey.contrib.djhuey import periodic_task

from .models import RaceData


@periodic_task(crontab(minute='*/1'))  # run every minute
def scheduled():
    team = nitrotype.Team('PR2W')
    team_id = team.data["info"]["teamID"]
    timestamp = datetime.datetime.now()
    for members in team.data["members"]:
        racer_id = members["userID"]
        races = members["played"]
        time = members["secs"]
        typed = members["typed"]
        errs = members["errs"]
        # Build and save one RaceData row per team member.
        rcd = RaceData(
            racer_id=racer_id,
            team_id=team_id,
            timestamp=timestamp,
            races=races,
            time=time,
            typed=typed,
            errs=errs
        )
        rcd.save()
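If it helps with debugging, here's a small helper I could drop into the same module (it reuses the RaceData import above) to log how many rows exist before and after each run. The logging setup is illustrative and not part of my actual task:

import logging

logger = logging.getLogger(__name__)


def log_row_count(stage):
    # Log how many rows data_racedata currently holds, so each scheduled
    # run shows whether it inserted 50 new rows or overwrote the old ones.
    logger.info("RaceData rows (%s): %d", stage, RaceData.objects.count())

Calling log_row_count("before") at the top of scheduled() and log_row_count("after") at the bottom should show whether the count grows by 50 each minute or stays stuck at 50.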
The code above runs every minute. Here's the PostgreSQL data I started with:
nttracker=# TABLE data_racedata;
racer_id | team_id | timestamp | races | time | typed | errs
----------+---------+------------+--------+---------+----------+--------
35051013 | 765879 | 1623410530 | 4823 | 123226 | 793462 | 42975
35272676 | 765879 | 1623410530 | 8354 | 211400 | 1844434 | 38899
36690038 | 765879 | 1623410530 | 113 | 2849 | 16066 | 995
38486084 | 765879 | 1623410530 | 34448 | 903144 | 8043345 | 586297
38625235 | 765879 | 1623410530 | 108 | 2779 | 20919 | 1281
39018052 | 765879 | 1623410530 | 1908 | 48898 | 395187 | 24384
39114823 | 765879 | 1623410530 | 2441 | 64170 | 440503 | 32594
...
(50 rows)
Afterward, I run Huey, which executes scheduled() every minute. Here's what I end up with after two minutes (in other words, two iterations):
nttracker=# TABLE data_racedata;
racer_id | team_id | timestamp | races | time | typed | errs
----------+---------+------------+--------+---------+----------+--------
35051013 | 765879 | 1623410992 | 4823 | 123226 | 793462 | 42975
35272676 | 765879 | 1623410992 | 8354 | 211400 | 1844434 | 38899
36690038 | 765879 | 1623410992 | 113 | 2849 | 16066 | 995
38486084 | 765879 | 1623410992 | 34448 | 903144 | 8043345 | 586297
38625235 | 765879 | 1623410992 | 108 | 2779 | 20919 | 1281
39018052 | 765879 | 1623410992 | 1908 | 48898 | 395187 | 24384
39114823 | 765879 | 1623410992 | 2441 | 64170 | 440503 | 32594
...
(50 rows)
Note: most of the data just happens to be the same; the timestamp is always different between the automated runs.
I'd like 150 rows instead of 50, since I want the data to accumulate rather than replace the previous rows. Can anyone tell me where I've gone wrong?
If anyone needs additional log outputs, please comment below.
EDIT: Here's the model:
from django.db import models
from django_unixdatetimefield import UnixDateTimeField  # from the django-unixdatetimefield package


class RaceData(models.Model):
    racer_id = models.IntegerField(primary_key=True)
    team_id = models.IntegerField()
    timestamp = UnixDateTimeField()
    races = models.IntegerField()
    time = models.IntegerField()
    typed = models.IntegerField()
    errs = models.IntegerField()
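In case it's relevant, here's a minimal check I could run in the Django shell (python manage.py shell) to see how two saves with the same racer_id behave. The values are made up, and I'm assuming the app is named data (the table is data_racedata):

import datetime

from data.models import RaceData  # adjust to the app's actual import path

# Save two objects that share the same racer_id but have different field values.
RaceData(racer_id=1, team_id=765879, timestamp=datetime.datetime.now(),
         races=10, time=100, typed=1000, errs=5).save()
RaceData(racer_id=1, team_id=765879, timestamp=datetime.datetime.now(),
         races=11, time=110, typed=1100, errs=6).save()

# If this prints 1 instead of 2, the second save() replaced (updated) the
# first row rather than inserting a new one.
print(RaceData.objects.filter(racer_id=1).count())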
Thanks in advance.