The iVia software is used in a range of projects, including the iVia Virtual Library Software, which creates and manages Virtual Libraries both automatically and under the direct control of living, breathing, human librarians.
iVia can be used to download pages from other Web sites. When iVia downloads a page, it reports a User Agent string to the Web server; you may see these strings in your Web server logs. Here is an example:
iVia/5.0 SiteChecker (http://ivia.ucr.edu/useragents.shtml)
Please note that iVia is Free Software, written by the INFOMINE/iVia Project and freely available for download on the Internet. A User Agent string in your logs that starts with iVia, like the one above, means the iVia software was used to download the page; it does not necessarily mean that the INFOMINE/iVia Project itself downloaded the page (though that is possible). To find out who is using the iVia software, examine the IP address of the requester in your logs. (The INFOMINE Project currently uses IP address 22.214.171.124.)
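As a quick way to see which hosts are running iVia against your site, you can extract the requesting IP addresses from your logs. This sketch assumes the Apache "combined" log format, where the client IP is the first field; the log path is only an example:

```shell
# List IP addresses that sent iVia user agents, most frequent first.
# Assumes Apache combined log format; adjust the path for your server.
grep 'iVia/' /var/log/apache2/access.log \
  | awk '{print $1}' \
  | sort | uniq -c | sort -rn
```

Compare the resulting addresses against the INFOMINE Project address above to tell our traffic apart from other installations of the software.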
iVia downloads pages for several different purposes, using several different programs. By default, the User Agent strings these programs report begin with iVia, followed by the version number and then the name of the iVia module that initiated the download.
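If you process your logs programmatically, a User Agent of this shape can be split into its parts. This is only a sketch; the format "iVia/<version> <Module> (<info URL>)" is inferred from the example string above, not from a published specification:

```python
import re

# Parse an iVia-style user agent, e.g.
# "iVia/5.0 SiteChecker (http://ivia.ucr.edu/useragents.shtml)",
# into version, module name, and information URL.
UA_RE = re.compile(r'^iVia/(?P<version>[\d.]+)\s+(?P<module>\S+)\s+\((?P<url>[^)]+)\)$')

def parse_ivia_ua(ua):
    """Return a dict of the user agent's parts, or None if it is not iVia-style."""
    m = UA_RE.match(ua)
    return m.groupdict() if m else None

print(parse_ivia_ua("iVia/5.0 SiteChecker (http://ivia.ucr.edu/useragents.shtml)"))
```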
By default, iVia respects robots.txt files, though it can be reprogrammed to ignore them. Please note that when a page is downloaded at the explicit request of a (human) user, we consider this the moral equivalent of that person downloading the page in their Web browser; robots.txt files therefore do not apply in that case. Such downloads are explicitly noted below.
iVia user agents used by this project will start with one of the two following strings:
Please see http://www.robotstxt.org for more information on how to set up patterns to restrict robot access.
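As a sketch, a robots.txt entry like the following should exclude compliant iVia crawlers from your entire site; note that the User-agent token iVia is an assumption based on the prefix described above, not a token we have published:

```
User-agent: iVia
Disallow: /
```

To restrict only part of your site, replace the / with the path prefix you want to protect.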
You can contact us at support_at_ivia.ucr.edu. Before contacting us, please make sure your robots.txt file is properly configured. Supply as much diagnostic information as possible; at a minimum, provide the User Agent string, the URL that was accessed, and the time of the access.