Improved the Indexer API modules with better error handling for missing or incomplete shows; shows that lack actor or banner info are now handled gracefully as well.
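As an illustration of the kind of defensive handling meant here (the client call and exception name are hypothetical stand-ins, not the actual indexer API):

    class IndexerError(Exception):
        """Hypothetical stand-in for the indexer library's lookup error."""

    def safe_get_show(indexer, indexer_id):
        """Fetch show data while tolerating missing or incomplete records."""
        try:
            show = indexer.get_show(indexer_id)   # hypothetical indexer call
        except IndexerError:
            return None                           # show missing or incomplete
        # Some shows legitimately have no actor or banner data; treat the
        # absence as normal rather than letting it raise further down.
        show.setdefault("actors", [])
        show.setdefault("banner", None)
        return show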
I get this error when using the API:
{"result":"error", "message": "error while composing output: "utf8 : terça 9:00 : error 3 : error 4 : invalid continuation byte"}
There's a special character in "TERÇA".
I don't know why this happens. I had a custom date style, and even after changing it to "yyyy-mm-dd" the error still occurs; I don't know why the API builder is still using the translated weekday. The timezone is also set to "network".
The error only happens when the missing/future list contains the weekdays "terça" (Tuesday) or "sábado" (Saturday); the other weekday names have no special characters.
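For reference, that "invalid continuation byte" message is what Python's UTF-8 decoder raises when it hits bytes produced by a single-byte encoding such as Latin-1; the "ç" in "terça" reproduces it, which is consistent with only the accented weekday names triggering the error:

    # "terça" encoded as Latin-1 contains the single byte 0xE7 for "ç",
    # which is not valid UTF-8 and triggers the reported decode error.
    day = u"ter\u00e7a 9:00"            # localized weekday string
    raw = day.encode("latin-1")         # b'ter\xe7a 9:00'
    try:
        raw.decode("utf-8")
    except UnicodeDecodeError as err:
        print(err)  # ... can't decode byte 0xe7 in position 3: invalid continuation byte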
Moved the local/network datetime conversion setting into its own function.
Function parse_date_time() now returns a correct timezone-aware datetime when possible (a minimal sketch of such a converter follows after this group of notes).
Changed the web API to use the new converter.
Fixed the Daily Searcher.
Fixed saving of the old DateTime setting.
Added a safety check for whether network_dict is already loaded.
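A minimal sketch of such a converter using only the standard library; the real parse_date_time() also draws on the network timezone data (network_dict), and the parameter names below are illustrative:

    from datetime import date, datetime, time, timedelta, timezone

    def parse_date_time(air_date, air_time, network_tz, convert_to_local=False):
        """Combine an air date and time into a timezone-aware datetime.

        network_tz is the airing network's tzinfo; convert_to_local mirrors
        the local/network display setting.
        """
        aware = datetime.combine(air_date, air_time).replace(tzinfo=network_tz)
        # astimezone() with no argument converts to the machine's local timezone.
        return aware.astimezone() if convert_to_local else aware

    # e.g. a 21:00 airing on a UTC-5 network, shown in the user's local time:
    us_eastern = timezone(timedelta(hours=-5))
    local_dt = parse_date_time(date(2014, 7, 1), time(21, 0), us_eastern, convert_to_local=True)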
The API should return a Content-Type of application/json for JSON data instead of application/html. Tornado only sends the "Content-Type: application/json" header when it is given a dict that it can JSON-encode itself, but SR already encodes its data and also supports JSONP. So the pre-encoded string is wrapped in a dict, and a check was added in Tornado's write() to unwrap it and skip re-encoding, which sets the correct Content-Type header while keeping JSONP working.
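Roughly, the approach looks like the sketch below; the handler class and the marker key ("payload") are illustrative names, not SR's exact patch. The handler receives a dict carrying the already-encoded string, unwraps it, sets the JSON Content-Type itself, and skips Tornado's own encoding so JSONP-wrapped output passes through untouched:

    from tornado.web import RequestHandler

    class ApiHandler(RequestHandler):
        def write(self, chunk):
            # SR has already JSON/JSONP-encoded its output, so the pre-encoded
            # string arrives wrapped in a dict under a marker key and is
            # unwrapped here instead of letting Tornado encode it a second time.
            if isinstance(chunk, dict) and "payload" in chunk:
                self.set_header("Content-Type", "application/json; charset=UTF-8")
                chunk = chunk["payload"]
            super().write(chunk)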
Added file HACKS.txt to serve as a reminder for anyone updating the library.
Provider getURL and downloadResult functions have been removed and replaced with the ones from helpers.py, slimming down the code and allowing better control over request sessions.
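For context, the consolidated helper behaves roughly like the sketch below (the parameter list and defaults are assumptions, not SR's exact signature); routing every provider request through one requests.Session keeps cookies, headers and connection pooling in a single place:

    import requests

    def getURL(url, session=None, params=None, headers=None, timeout=30, want_json=False):
        """Fetch a URL through a shared requests.Session."""
        session = session or requests.Session()
        try:
            resp = session.get(url, params=params, headers=headers, timeout=timeout)
            resp.raise_for_status()
        except requests.RequestException:
            return None
        return resp.json() if want_json else resp.content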
Removed TVTumbler code.
Fixed HDBits provider.
Fixed config settings that were meant to be booleans but were instead stored as str or int; this should help resolve random HTML errors.
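The pitfall behind this fix: a boolean setting read back from the config as the string "0" is still truthy in Python, so templates and checks behave unpredictably. An illustrative coercion helper (not SR's actual config code):

    def to_bool(value):
        """Coerce config values such as "0", "1", 0, 1, True to a real bool."""
        if isinstance(value, str):
            return value.strip().lower() in ("1", "true", "on", "yes")
        return bool(value)

    assert bool("0") is True       # the bug: any non-empty string is truthy
    assert to_bool("0") is False   # the intended behaviour
    assert to_bool(1) is True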
XEM Refresh check re-coded.
NameParser code for creating the show object has been changed to only attempt creation at the very end, once the bestMatch result has been found; this helps with resources and performance.
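Conceptually the change just reorders the work so the expensive show lookup happens once, after scoring, rather than for every candidate; a simplified sketch with hypothetical helper and key names (get_show, score, series_name):

    def pick_best_match(parse_results, get_show):
        """Score all candidate parses first, then build the show object once."""
        best = max(parse_results, key=lambda r: r["score"], default=None)
        if best is None:
            return None
        # Only the winning candidate pays the cost of creating a show object.
        best["show"] = get_show(best["series_name"])
        return best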
Indexer mapping now uses indexer API calls to gather its information, then stores it in a new table called indexer_mapping for instant lookups later on.
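Roughly the lookup pattern this enables (the column layout below is illustrative rather than the exact SR schema): check indexer_mapping first and only fall back to an indexer/Trakt API call on a miss, storing the result for instant lookups afterwards.

    import sqlite3

    def get_mapped_id(db_path, indexer_id, indexer, fetch_from_api):
        """Return a mapped indexer ID, filling the indexer_mapping table on a miss."""
        with sqlite3.connect(db_path) as con:
            con.execute("CREATE TABLE IF NOT EXISTS indexer_mapping "
                        "(indexer_id INTEGER, indexer INTEGER, "
                        " mindexer_id INTEGER, mindexer INTEGER)")
            row = con.execute("SELECT mindexer_id FROM indexer_mapping "
                              "WHERE indexer_id = ? AND indexer = ?",
                              (indexer_id, indexer)).fetchone()
            if row:
                return row[0]                       # instant lookup, no API call
            # Cache miss: ask the indexer/Trakt API once, then remember the answer.
            mapped_id, mapped_indexer = fetch_from_api(indexer_id, indexer)
            con.execute("INSERT INTO indexer_mapping VALUES (?, ?, ?, ?)",
                        (indexer_id, indexer, mapped_id, mapped_indexer))
        return mapped_id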
Fixed Trakt-related issues with adding new shows and syncing.
Centered items at the bottom of pages so they look a little nicer and fit properly.
Fixed charmap issues for anime show names.
Fixed issues with the display show page and epCat key errors.
Fixed duplicate log messages for clearing provider caches.
Fixed issues with email notifier episode names not being properly encoded to UTF-8.
TVDB<->TVRAGE indexer ID mapping is now performed on demand, when needed: for example, newznab providers can be searched with tvrage_ids, and some return tvrage_ids that can later be used to create show objects for faster and more accurate name parsing. The mapping is done via Trakt API calls.
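As an example of the newznab side of this, a search by TVRage ID uses the standard tvsearch mode's rid parameter; the base URL below is a placeholder and the JSON output flag depends on the provider:

    import requests

    def newznab_search_by_tvrage(base_url, apikey, tvrage_id, season=None, episode=None):
        """Query a newznab provider by TVRage ID (tvsearch mode, rid parameter)."""
        params = {"t": "tvsearch", "apikey": apikey, "rid": tvrage_id, "o": "json"}
        if season is not None:
            params["season"] = season
        if episode is not None:
            params["ep"] = episode
        resp = requests.get(base_url.rstrip("/") + "/api", params=params, timeout=30)
        resp.raise_for_status()
        return resp.json()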
Added stop event signals to scheduled tasks; SR now waits indefinitely until a task has fully stopped before completing a restart or shutdown event.
NameParserCache is now persistent and stores up to 200 parsed results at any given time for quicker lookups and better performance; this helps maintain results across updates and shutdown/startup events.
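The behaviour described amounts to a small bounded most-recently-used cache keyed by release name; a minimal in-memory sketch of the idea (the on-disk persistence and SR's real NameParserCache interface are omitted):

    from collections import OrderedDict

    class BoundedParseCache:
        """Keep only the most recently used parse results (200 by default)."""
        def __init__(self, max_size=200):
            self._cache = OrderedDict()
            self._max_size = max_size

        def add(self, name, parse_result):
            self._cache[name] = parse_result
            self._cache.move_to_end(name)
            while len(self._cache) > self._max_size:
                self._cache.popitem(last=False)   # evict the oldest entry

        def get(self, name):
            result = self._cache.get(name)
            if result is not None:
                self._cache.move_to_end(name)     # mark as recently used
            return result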
Black and white lists for anime now only get used for anime shows, as intended; this is a performance gain for non-anime shows that don't need to load these lists.
The internal name cache now builds itself on demand, per show request, and checks whether the show is already in the cache, exiting the routine early to save time if it is.
The Scheduler and QueueItem classes are now subclasses of threading.Thread, and a stop threading-event signal has been added to each.
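In outline, the stoppable-thread pattern behind this note and the earlier restart/shutdown note looks like the generic sketch below (not SR's actual Scheduler class):

    import threading

    class StoppableScheduler(threading.Thread):
        """A worker thread that runs an action on a cycle until told to stop."""
        def __init__(self, action, cycle_seconds=60):
            super().__init__(name="SCHEDULER")
            self.action = action
            self.cycle_seconds = cycle_seconds
            self.stop_event = threading.Event()

        def run(self):
            # wait() doubles as the sleep between cycles and wakes up early on stop().
            while not self.stop_event.wait(self.cycle_seconds):
                self.action()

        def stop(self):
            self.stop_event.set()

    # A restart/shutdown then waits indefinitely for the task to finish:
    #   scheduler.stop(); scheduler.join()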
If I forgot to list something, it doesn't mean it's not fixed, so please test and report back if anything is wrong or has been corrected in this new release.
Removed multi-threading for now, as it was causing more problems than it was useful.
Added: in match & snatch, any quality from the initial quality settings gets downloaded first and searching does not continue; if archive qualities exist, it will stop once it hits the max quality from that list.
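A simplified sketch of that decision rule, assuming qualities are comparable integer values as in SR's Quality constants (the function name and signature are illustrative):

    def should_stop_searching(found_quality, initial_qualities, archive_qualities):
        """Snatch-and-stop rule: an initial-quality hit ends the search unless an
        archive (upgrade) list exists, in which case searching continues until
        the best archive quality has been reached."""
        if archive_qualities:
            return found_quality >= max(archive_qualities)
        return found_quality in initial_qualities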