Using Solr/pysolr with Flask-SQLAlchemy -


I am trying to set up Solr to search data that lives in a Postgres database, which I access through the Flask-SQLAlchemy ORM. I am using the pysolr library for this, but it is not clear how to wire the Solr index into the SQLAlchemy models. Is there an example?

pysolr suggests inserting documents manually through solr.add, but it is not clear how you would keep the indexed documents for different database tables separate.
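One common way to keep tables apart in a single Solr core is to store the table name in its own field and build it into the document id. A minimal sketch (the field names, core URL, and the `solr_doc` helper are illustrative, not part of pysolr):

```python
# A pysolr client would be created like this (needs a running Solr core):
#   import pysolr
#   solr = pysolr.Solr("http://localhost:8983/solr/mycore", timeout=10)
# The documents themselves are plain dicts:

def solr_doc(table, pk, **fields):
    """Build a Solr document whose id is unique across tables."""
    return {"id": f"{table}:{pk}", "table": table, **fields}

docs = [
    solr_doc("user", 1, name="Alice"),
    solr_doc("post", 7, title="Hello Solr"),
]
# solr.add(docs)                         # index them
# solr.search("alice", fq="table:user")  # restrict a query to one table
```

The `fq` (filter query) parameter is how you would scope a search to a single table at query time instead of filtering results in Python.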

After doing some research I have arrived at the following approach, which I think is correct:

  1. In the ORM models, hook the after_insert, after_update and after_delete events, and in those handlers add, update or delete the object's data in Solr.

  2. Give each Solr document the id db_table_name + db_id, so that ids stay unique across tables.

  3. When searching, fetch the results, manually filter them down to the required DB table, extract the ids, query the DB for those ids, and use the DB rows.
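Steps 1 and 2 above can be sketched with SQLAlchemy mapper events. The model and the recording stub standing in for a real pysolr client are illustrative; a real setup would call pysolr's add/delete instead. Note that mapper events fire during flush, so a production version might batch changes and push them in an after_commit session hook:

```python
from sqlalchemy import Column, Integer, String, create_engine, event
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class User(Base):  # illustrative model
    __tablename__ = "user"
    id = Column(Integer, primary_key=True)
    name = Column(String)

# Anything with add()/delete() works here, e.g. a pysolr.Solr instance.
# A recording stub keeps the sketch runnable without a Solr server.
class RecordingSolr:
    def __init__(self):
        self.added, self.deleted = [], []
    def add(self, docs):
        self.added.extend(docs)
    def delete(self, id=None):
        self.deleted.append(id)

solr = RecordingSolr()

def to_doc(obj):
    # Step 2: table name + primary key keeps the id unique across tables.
    return {"id": f"{obj.__tablename__}:{obj.id}", "name": obj.name}

@event.listens_for(User, "after_insert")
@event.listens_for(User, "after_update")
def index_user(mapper, connection, target):
    solr.add([to_doc(target)])

@event.listens_for(User, "after_delete")
def unindex_user(mapper, connection, target):
    solr.delete(id=f"{target.__tablename__}:{target.id}")

# Usage: committing an insert pushes the document to the index.
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
with Session(engine) as session:
    session.add(User(name="Alice"))
    session.commit()
```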

Is there a better way to do this? Thank you.

SQLAlchemy and Solr are separate systems. I think a better solution is to implement a script that synchronizes the data: for new data, run the script every 30 minutes or every hour.

That way, if there is a problem with your Solr service, your website (and its database access) is not affected. Keep the services independent.
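A periodic sync script of the kind suggested above could look like this. It assumes the model has an updated_at timestamp column (an assumption, not something from the question), selects rows changed since the last run, and pushes them to Solr; the stub again stands in for a pysolr client:

```python
from datetime import datetime

from sqlalchemy import Column, DateTime, Integer, String, create_engine, select
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Post(Base):  # illustrative model; assumes an updated_at column exists
    __tablename__ = "post"
    id = Column(Integer, primary_key=True)
    title = Column(String)
    updated_at = Column(DateTime, default=datetime.utcnow)

def sync_posts(session, solr, since):
    """Push rows changed since the last run to Solr; return how many."""
    rows = session.execute(
        select(Post).where(Post.updated_at > since)
    ).scalars().all()
    docs = [{"id": f"post:{p.id}", "title": p.title} for p in rows]
    if docs:
        solr.add(docs)  # pysolr.Solr.add in a real deployment
    return len(docs)

class RecordingSolr:  # stub so the sketch runs without a Solr server
    def __init__(self):
        self.added = []
    def add(self, docs):
        self.added.extend(docs)

# Usage: a cron job or scheduler would call sync_posts with the
# timestamp of the previous successful run.
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
stub = RecordingSolr()
with Session(engine) as session:
    session.add(Post(title="Hello Solr"))
    session.commit()
    pushed = sync_posts(session, stub, datetime(2000, 1, 1))
```

Tracking the "last run" timestamp (in a file or a small DB table) is what keeps each run incremental instead of reindexing everything.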
