Conversation

I'm reading it on a mobile phone so I'm probably missing things. Doing r/w at the full-file scale probably won't help with performance. I might be able to come up with something better, but only on weekends. From my experience, anything other than full WAL is unreliable; it has to be enabled for SQLite to be robust against corruption.
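Enabling WAL is a one-line pragma in `sqlite3` (a minimal sketch; the `cache_database.db` path is illustrative, and WAL only applies to file-backed databases):

```python
import os
import sqlite3
import tempfile

# Illustrative path; WAL has no effect on in-memory databases.
db_path = os.path.join(tempfile.mkdtemp(), "cache_database.db")
conn = sqlite3.connect(db_path)
# The pragma returns the journal mode now in effect.
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
print(mode)  # "wal"
conn.close()
```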

@wfjsw I read the entire database once at the launch of the web UI; subsequent writes, if any, are only per entry. It does not rewrite the entire database.

wow 500MB 🤯

Finally got around to testing it. Nice size reduction. Startup took a while to create the new database, though; unsure how long since I AFK'd while it was doing it, but .json cache rebuild times are also long. Everything else seems to work exactly as with the older cache; a fresh launch after a (re)boot takes about the same time but "seemed" a tiny bit faster, and subsequent relaunches are as fast as before.

When and if this is merged, I intend to write a migration sequence that uses the json file to build the database. Because I currently consider this to be in an "experimental" phase, I didn't want to add extra logic for migration yet; the migration logic should be relatively simple.
The main benefit of this is that writing a new cache entry doesn't require rewriting the entire file.

TBH I think we should just use https://pypi.org/project/diskcache/ instead. It's well battle-tested, and backed by SQLite (+ disk for large objects). Also, to the app it looks just like a dict, which is very useful.

cool, didn't know about diskcache

merged #15287



Description
Instead of using a json file, use a SQLite database for the cache.
The reason for this is that we have seen some users with extremely large cache files; for example, catboxanon's `cache.json` is > 236MB. It becomes increasingly impractical to use a json file as a method of storage, and we have seen some cases of `cache.json` being corrupted for unknown reasons (#11773).
The data will be stored under `cache_database.db`, or the path specified by the env `SD_WEBUI_CACHE_DATABASE`.
To make as few changes as possible, I implemented a translation layer that essentially represents the database as a python object, the same as if we had loaded and saved a json file.
Assuming that I have done everything correctly, it should be basically transparent.
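A minimal sketch of what such a translation layer could look like (the class and table names here are hypothetical, not the PR's actual code): the table is read into a dict once at startup, and each assignment writes back only that one row.

```python
import json
import sqlite3


class CacheDict(dict):
    """Dict-like view of a SQLite table (sketch; names are hypothetical)."""

    def __init__(self, db_path):
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS hashes (key TEXT PRIMARY KEY, value TEXT)"
        )
        # read the entire table into memory once, at launch
        super().__init__(
            (key, json.loads(value))
            for key, value in self.conn.execute("SELECT key, value FROM hashes")
        )

    def __setitem__(self, key, value):
        super().__setitem__(key, value)
        # a write touches only the affected row, never the whole file
        self.conn.execute(
            "INSERT OR REPLACE INTO hashes (key, value) VALUES (?, ?)",
            (key, json.dumps(value)),
        )
        self.conn.commit()
```

Code that used to treat the loaded `cache.json` as a plain dict would see the same interface.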
The implementation is meant to be simple, but this also means it is not truly optimized for an SQL database: I write json strings into the `value` column of the table, one row per entry. This means there will be a slight overhead when reading and writing to the database to do the json conversion.
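Concretely, the per-entry json conversion might look like this (a sketch assuming a simple key/value schema; the names are illustrative):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hashes (key TEXT PRIMARY KEY, value TEXT)")

entry = {"mtime": 1710000000.0, "sha256": "deadbeef"}
# write: python object -> json string (the conversion overhead on write)
conn.execute(
    "INSERT INTO hashes VALUES (?, ?)",
    ("some/model.safetensors", json.dumps(entry)),
)
# read: json string -> python object (the conversion overhead on read)
row = conn.execute(
    "SELECT value FROM hashes WHERE key = ?", ("some/model.safetensors",)
).fetchone()
assert json.loads(row[0]) == entry
```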
Currently this is implemented as a toggle, which can be changed under `settings > system > Use sqlite for cache`; I store the value under the key `experimental_sqlite_cache`. If this is proven stable with no compatibility issues, we can enable it by default in the future.

Other changes
I changed the file modification time comparison that triggers a cache refresh from `greater than` to `not equal`. This makes sense because you might replace a file with another file that has an older modification date; if this is the case, the cache refresh would not be triggered, which is undesirable.
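The difference between the two comparisons can be shown with hypothetical timestamps:

```python
# Hypothetical mtimes: the cached file was replaced by an *older* file.
cached_mtime = 1700000000.0
current_mtime = 1600000000.0

refresh_with_gt = current_mtime > cached_mtime    # False: stale cache is kept
refresh_with_ne = current_mtime != cached_mtime   # True: cache is refreshed
```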
TODO
If in the future this is going to be set as the default, it should be a simple matter to write a migration script to migrate `cache.json` to the db, so that users don't have to spend time recalculating.

Database table structure
Table: hashes
Table: safetensors-metadata
This will be read into memory as one big object, with a structure mimicking `cache.json`.

Checklist:
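For the TODO above, the migration script could be sketched roughly like this (a hypothetical helper; the real schema and file locations may differ):

```python
import json
import sqlite3


def migrate_json_cache(json_path, db_path):
    """Copy cache.json sections into per-section SQLite tables (sketch)."""
    with open(json_path, encoding="utf8") as file:
        cache = json.load(file)
    conn = sqlite3.connect(db_path)
    for section, entries in cache.items():
        table = section.replace('"', '""')  # quote-safe identifier
        conn.execute(
            f'CREATE TABLE IF NOT EXISTS "{table}" (key TEXT PRIMARY KEY, value TEXT)'
        )
        conn.executemany(
            f'INSERT OR REPLACE INTO "{table}" (key, value) VALUES (?, ?)',
            ((key, json.dumps(value)) for key, value in entries.items()),
        )
    conn.commit()
    conn.close()
```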