I am trying to automate a Python script with a batch file. The code works fine on my own computer, but when the batch file runs it on a remote desktop it fails with the operational error "too many SQL variables". Apparently this is because an SQLite query is limited to 999 parameters, and mine exceeds that limit.

How do I actually increase this limit, or break the data into chunks of 999 columns? I came across many posts saying to increase the limit at compile time, but I don't know how to do that. `to_sql` has a `chunksize` argument, but it chunks rows, not columns. I'm using SQLite.

My Python code to insert data is:
df.tail(1).to_sql("table", sqlcon, if_exists="append", index=True)
Thanks !
A schema with more than 999 columns should probably be rethought. That said, here's how to work around it.
You can upgrade to SQLite 3.32.0 or later, where SQLITE_MAX_VARIABLE_NUMBER defaults to 32766. If you need more parameters than that, the schema design itself is the problem.
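To see which SQLite library your Python build is linked against (the parameter limit depends on this library, not on your Python version), you can check from the interpreter:

```python
import sqlite3

# sqlite3.sqlite_version is the version of the linked SQLite library;
# 3.32.0 or later defaults to 32766 host parameters per statement.
print(sqlite3.sqlite_version)
```

If the remote desktop prints an older version than your own machine, that would explain why the same code fails only there.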
Otherwise, if for some reason upgrading is not an option: the hard-coded limits can only be lowered at runtime (via `sqlite3_limit`). If you want to raise them you will have to recompile SQLite with a higher SQLITE_MAX_VARIABLE_NUMBER, which will make your program difficult to deploy using standard dependency managers.
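If you'd rather not touch the SQLite build at all, you can split the columns into chunks yourself and write each chunk to its own table, keeping every insert under the 999-parameter limit. A minimal sketch (the `append_wide_row` helper and `prices` table prefix are my own names, not part of your code):

```python
import math
import sqlite3

import pandas as pd


def append_wide_row(df, con, table_prefix, max_params=999):
    """Append the last row of a very wide DataFrame, splitting its columns
    into chunks that stay under SQLite's per-statement parameter limit.
    Each chunk goes to its own table (prefix_0, prefix_1, ...), all sharing
    the index so the pieces can be joined back together later."""
    cols_per_chunk = max_params - 1  # reserve one parameter for the index
    n_chunks = math.ceil(len(df.columns) / cols_per_chunk)
    for i in range(n_chunks):
        chunk = df.columns[i * cols_per_chunk:(i + 1) * cols_per_chunk]
        df.tail(1)[chunk].to_sql(
            f"{table_prefix}_{i}", con, if_exists="append", index=True
        )


# Example: 2500 columns, well over the 999-parameter limit.
con = sqlite3.connect(":memory:")
df = pd.DataFrame([[1.0] * 2500], columns=[f"sec{i}" for i in range(2500)])
append_wide_row(df, con, "prices")
```

This keeps your existing wide layout, at the cost of the data being spread across several tables.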
Yes, I've thought about that, but for this purpose I don't think I can really change it: the rows are basically dates, and the columns are the securities we need to store data for.
This is a job for a join table.
-- One row per security.
create table securities (
    id integer primary key,
    symbol text not null unique,
    name text not null
);

-- One row per (security, date) observation, instead of one column per security.
create table security_prices (
    security_id integer not null references securities(id),
    retrieved_at datetime not null,
    price integer not null
);
-- All prices recorded on a given date.
select symbol, price
from security_prices sp
join securities s on s.id = sp.security_id
where retrieved_at = ?;
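Loading your existing wide DataFrame into this schema is a reshape from wide to long with `melt`, after which each insert needs only three parameters. A sketch, assuming the tables above and an in-memory database (the ticker symbols and prices are made up for illustration):

```python
import sqlite3

import pandas as pd

con = sqlite3.connect(":memory:")
con.executescript("""
create table securities (
    id integer primary key,
    symbol text not null unique,
    name text not null
);
create table security_prices (
    security_id integer not null references securities(id),
    retrieved_at datetime not null,
    price integer not null
);
""")

# Wide frame: dates down the index, one column per security.
wide = pd.DataFrame(
    {"AAPL": [101, 102], "MSFT": [201, 202]},
    index=pd.to_datetime(["2023-01-02", "2023-01-03"]),
)
wide.index.name = "retrieved_at"

# Register each symbol once.
con.executemany(
    "insert or ignore into securities (symbol, name) values (?, ?)",
    [(sym, sym) for sym in wide.columns],
)

# Melt to long form: one (date, symbol, price) row per cell.
long = wide.reset_index().melt(
    id_vars="retrieved_at", var_name="symbol", value_name="price"
)

# Each insert uses only three parameters, regardless of how many
# securities there are.
con.executemany(
    """insert into security_prices (security_id, retrieved_at, price)
       select id, ?, ? from securities where symbol = ?""",
    [(str(r.retrieved_at), int(r.price), r.symbol)
     for r in long.itertuples()],
)
con.commit()
```

Adding a new security is then a one-row insert into `securities`, not a schema change.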