
Developing your own crawler

Hi,

Does anyone have any experience developing their own crawlers? We need a mechanism that searches specific repositories and adds custom properties from external data sources to resources.

Or is there some other way to accomplish this (for example, a KM service)?

Thanks,

Dick


2 Answers

  • Best Answer
    Apr 26, 2004 at 07:52 AM

    Hi Dick,

    Although it is possible to develop your own crawler that would be able to do that, I'm not sure it is what you want to do.

    Within the KM, the crawlers are separated into a "navigation layer", which handles the retrieval of resources as well as the navigation through the links, and a "result-handling layer", which does something with the resources found by the "navigation layer".

    For searching, the KM's index management uses its own implementation of this "result-handling layer", which cannot be changed by custom code.

    So, if you want to search a specific repository, this can be handled by the index management (just by assigning the specific repository folder as a data source to an index).
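    For example, an index created in Index Administration with only the folder of that repository assigned as its data source (say, /documents/MyRepository; the path is just an illustration) will restrict the search to that repository.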

    To add custom properties from an external data source to resources (within a specific repository), it might be a good idea to implement a PropertyReadFilter.

    This PropertyReadFilter can either be configured for a repository or be registered dynamically by a global service.
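
    Roughly, such a filter could look like the sketch below. Be aware that this is only a sketch: the exact package, interface, and method signature of the read filter should be checked against the KM javadocs of your portal version, and the external lookup here is a stub standing in for your own JDBC or web service code.

        // Sketch of a property read filter that merges a value from an
        // external data source into the properties of a resource on read.
        // NOTE: the exact IPropertyReadFilter contract is an assumption
        // here -- verify package and signature in the KM javadocs.
        package com.example.km.filter; // hypothetical package

        import com.sapportals.wcm.repository.IResource;
        import com.sapportals.wcm.repository.IMutablePropertyMap;
        import com.sapportals.wcm.repository.Property;
        import com.sapportals.wcm.repository.PropertyName;

        public class ExternalDataReadFilter implements IPropertyReadFilter {

          // Illustrative namespace and property ID; they must match the
          // entries configured later in the Property Metadata.
          private static final PropertyName CUSTOMER_ID =
            new PropertyName("http://example.com/xmlns/custom", "customer_id");

          // Assumed callback: invoked whenever the properties of a
          // resource are read.
          public void filterProperties(IResource resource,
                                       IMutablePropertyMap properties) {
            String value = lookupExternalValue(resource.getRID().toString());
            if (value != null) {
              // Add (or overwrite) the custom property on the fly.
              properties.put(new Property(CUSTOMER_ID, value));
            }
          }

          // Hypothetical stand-in for the external data source access
          // (JDBC, web service, ...); replace with real lookup code.
          private String lookupExternalValue(String rid) {
            return "4711"; // stub value for illustration
          }
        }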

    In order to have these properties indexed, they have to be made "known" to the index management, which is done by configuring the appropriate information in the Property Metadata.
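
    The Property Metadata is maintained in the portal (as far as I remember under System Administration -> System Configuration -> Knowledge Management -> Content Management -> Global Services -> Property Metadata). An entry for the example above could look roughly like this (illustrative values; the exact field and flag names may differ per release):

        Property ID:     customer_id
        Namespace Alias: custom   (alias mapped to http://example.com/xmlns/custom)
        Data Type:       String
        Indexable:       yes      (so that the index picks the property up)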

    Best regards,

    Paul


  • Former Member
    Apr 13, 2004 at 09:21 AM

    Hi,

    Maybe the Global Service/Property Metadata could help you.

    Regards, Josef
