Can One MySQL DB hold web site contents and index each web site?
Hello,
I am a new user of Sphinx. I am trying to scrape many web sites and store their contents in a single database. I would like to run the indexer so that each web site's contents go into its own index file, so that users can search within a specific web site's index when using the search engine. Can this be done in Sphinx?
Thank You
Mike Dawn
Support Staff 1 Posted by Pat Allan on 29 Jan, 2017 01:03 AM
Hi Mike,
Once the data is in a MySQL or PostgreSQL database - however it’s sourced - it can be indexed by Sphinx.
I’ve actually written a service similar to what you’re thinking - it takes in HTML data from static Jekyll-generated sites, and makes them searchable with Sphinx. The code is open sourced as well - if you’re familiar with Ruby it may be useful inspiration:
https://github.com/pat/drumknott-server
It’s worth noting that in this case I’m using real-time indices, which means I’m communicating the data updates via Sphinx’s SphinxQL protocol, rather than using Sphinx’s indexer CLI tool. Either approach would work fine in your case.
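For the indexer-based approach, the usual pattern is one source/index pair per site, where each source's SQL query filters the shared table down to that site's rows. Here is a minimal sketch of what that could look like in sphinx.conf; the database, table, and column names (`scraped_sites`, `pages`, `site`, `url`, `content`) are assumptions, so adjust them to match your actual schema:

```ini
# Sketch: one plain index per scraped site, all drawing from the same
# MySQL database. Table/column names here are hypothetical.

source site_a
{
  type      = mysql
  sql_host  = localhost
  sql_user  = sphinx
  sql_pass  = secret
  sql_db    = scraped_sites

  # Only rows belonging to this site end up in this index
  sql_query = SELECT id, url, content FROM pages WHERE site = 'a.example.com'
}

index site_a
{
  source = site_a
  path   = /var/lib/sphinx/data/site_a
}

# Repeat a source/index pair (or use Sphinx's config inheritance,
# e.g. "source site_b : site_a") for each additional site.
```

After running `indexer site_a`, a search limited to that site is just a query against that index, e.g. `SELECT * FROM site_a WHERE MATCH('keyword');` over SphinxQL. An alternative worth considering is a single index with a site-identifier attribute, filtered at query time, though per-site indices as above match what you described.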
Regards,
—
Pat