on 03-31-2004 10:28 AM
Hi,
Does anyone have any experience developing their own crawlers? We need a mechanism that searches specific repositories and adds custom properties from external data sources to resources.
Or is there some other way to accomplish this (for example, a KM service)?
Thanks,
Dick
Hi Dick,
Although it is possible to develop your own crawler that would be able to do that, I'm not sure it's what you want.
Within KM, the crawlers are separated into a "navigation layer", which handles the retrieval of resources as well as the navigation through the links, and a "result-handling layer", which does something with the resources found by the navigation layer.
For searching, KM's index management uses its own implementation of this result-handling layer, which cannot be changed by custom code.
So, if you want to search a specific repository, this can be handled by the index management (just by assigning the specific repository folder as a data source to an index).
To add custom properties from an external data source to resources (within a specific repository), it might be a good idea to implement a <b>PropertyReadFilter</b>.
This PropertyReadFilter can either be configured for a repository or be registered dynamically by a global service.
In order to have these properties indexed, they have to be made "known" to the index management, which is done by configuring the appropriate information in the PropertyMetaData.
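To illustrate the idea, here is a minimal, self-contained Java sketch of the filter pattern Paul describes. Note that all interface and class names below (PropertyReadFilter, ExternalSourceFilter, the property key) are hypothetical stand-ins, not the actual SAP KM API: a read filter is given a resource and its properties, and may enrich them with values looked up in an external data source before they are returned to callers or the indexer.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for the KM PropertyReadFilter concept;
// the real SAP KM interface and signatures differ.
interface PropertyReadFilter {
    // Called whenever a resource's properties are read; the filter
    // may add or override entries before they are returned.
    void filterProperties(String resourceId, Map<String, String> properties);
}

// Example filter that enriches resources with values from an external
// data source (simulated here by an in-memory map keyed by resource path).
class ExternalSourceFilter implements PropertyReadFilter {
    private final Map<String, String> externalSource;

    ExternalSourceFilter(Map<String, String> externalSource) {
        this.externalSource = externalSource;
    }

    @Override
    public void filterProperties(String resourceId, Map<String, String> properties) {
        String value = externalSource.get(resourceId);
        if (value != null) {
            // Namespace-qualified property key, since KM properties
            // are usually qualified by a namespace (key is illustrative).
            properties.put("{http://example.com/custom}department", value);
        }
    }
}

public class FilterDemo {
    public static void main(String[] args) {
        // External data source: maps resource paths to a custom value.
        Map<String, String> external = new HashMap<>();
        external.put("/documents/report.doc", "Sales");

        PropertyReadFilter filter = new ExternalSourceFilter(external);

        // Properties as read from the repository itself.
        Map<String, String> props = new HashMap<>();
        props.put("displayname", "report.doc");

        // The filter enriches them on read.
        filter.filterProperties("/documents/report.doc", props);
        System.out.println(props.get("{http://example.com/custom}department"));
    }
}
```

In the real system, the added property would only show up in search results once its PropertyMetaData is configured so the index management knows about it, as described above.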
Best regards,
Paul
Hi,
Maybe the Global Service/Property Metadata could help you.
Regards, Josef