File Collector
Introduction to File Collector
Use the File collector to import CSV and XML text files into Historian. Since the files can contain data, tags, tag properties, and messages, the File collector is a very useful tool for importing third party data into Historian.
You can import files using ANSI encoding only.
The File collector uses the ImportFiles folder for its operations, which is available in the Historian program folder. The ImportFiles folder contains the following subfolders:
Directory | Function |
---|---|
Error | If a CSV or XML file contains errors, the File collector will stop processing it and place the file in this folder. |
Incoming | Files that are to be processed by the File collector are placed here. |
Processed | Files that have been successfully imported by the File collector are placed here. |
Working | Files are placed in this folder while the File collector is importing their contents. |
To import a file, place it in the Incoming folder. At the beginning of the next cycle (one of the parameters you set through Historian Administrator), the system initiates the file import operation, processes the data, stores the result in an archive, and moves the file from the Incoming folder to the Processed folder. During processing, the file moves to the Working folder and its name changes to YMDHMS-data.csv or YMDHMS-data.xml (for example, 010810103246-data.csv).
When processing is complete, the name changes to YMDHMS-filename.csv or YMDHMS-filename.xml, as appropriate (for example, 010810103246-tagtest3line.csv). If errors occur during processing, error messages are logged in the FileCollector_YMDHMS.log file (for example, FileCollector_01081214759.log) within the LogFiles directory, and the file moves to the Error directory.
After the number of days you specify in the Processed File Purge parameter has passed, the system deletes the imported file from the Processed folder. The Error directory is never cleared.
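If you generate import files from another application, a minimal sketch like the following can automate the hand-off. It assumes a default install path of C:\Historian\ImportFiles (adjust for your system), copies a ready-made CSV into the Incoming folder, and then polls the Processed and Error folders to report the outcome:

```python
import shutil
import time
from pathlib import Path

IMPORT_ROOT = Path(r"C:\Historian\ImportFiles")   # assumed Historian program folder
INCOMING, PROCESSED, ERROR = (IMPORT_ROOT / d for d in ("Incoming", "Processed", "Error"))

def import_file(source_csv, timeout=300):
    """Copy a CSV into Incoming and report whether it ends up in Processed or Error."""
    name = Path(source_csv).name
    shutil.copy(source_csv, INCOMING / name)
    deadline = time.time() + timeout
    while time.time() < deadline:
        # Processed/Error copies carry a YMDHMS- prefix, so match on the suffix.
        if any(p.name.endswith(name) for p in PROCESSED.glob("*")):
            return "processed"
        if any(p.name.endswith(name) for p in ERROR.glob("*")):
            return "error"
        time.sleep(5)
    return "timeout"

print(import_file("tagdata.csv"))
```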
- Because a File collector is really an import function rather than a data collection operation, standard collector features such as compression, buffering, browsing, and start/stop collection do not apply.
- If you are adding tags using the File collector, ensure that you specify the Collector Source Address and the Tag Source Address fields in the CSV file. If you do not, the Collector and Source Names are not added to the tags created by the File collector.
CSV File Format
The format for a CSV file is as follows:
*Comments
[Command]
Header (Keywords)
values (separated by commas)
*Comments
where Command is [Tags], [Data], [Messages], or [Alarms], and Keywords is one or more of the header keywords listed below. Note that the order of the values must match the order of the keywords in the header. For example, if you are importing a tag name, timestamp, value, and quality, you would use the following syntax:
[Data]
Tagname,TimeStamp,Value,DataQuality
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:07,1,Good
Data values for a single entry should be kept on one line.
When importing Alarms and Events, only the following headers are considered:
- Acked
- Acktime
- Actor
- AlarmID
- Condition
- DataSource
- Enabled
- EndTime
- EventCategory
- ItemID
- Message
- Quality
- Severity
- Source
- StartTime
- SubCondition
- Tagname
- Timestamp
The following table lists the available header keywords:
Header Keywords | |
---|---|
DataQuality | Value |
Tagname | Description |
EngineeringUnits | Comment |
DataType | StringLength |
StoreMilliseconds | CollectorName |
CollectorType | SourceAddress |
CollectionType | CollectionInterval |
CollectionOffset | CollectionDisabled |
LoadBalancing | TimeStamp |
Type | TimeZoneBias |
HiEngineeringUnits | LoEngineeringUnits |
InputScaling | HiScale |
LoScale | CollectorCompression |
CollectorDeadbandPercentRange | CollectorCompressionTimeout |
ArchiveCompression | ArchiveDeadbandPercentRange |
ArchiveCompressionTimeout | Timeout |
CollectorGeneral1 | CollectorGeneral2 |
CollectorGeneral3 | CollectorGeneral4 |
CollectorGeneral5 | ReadSecurityGroup |
WriteSecurityGroup | AdministratorSecurityGroup |
Calculation | CalculationDependencies |
Acked | Condition |
SubCondition | EventCategory |
Message | Source |
Severity | StartTime |
EndTime | TimestampType |
SpikeLogic | SpikeLogicOverride |
InterfaceAbsoluteDeadband | InterfaceAbsoluteDeadbanding |
LastModified | LastModifiedUser |
ArchiveAbsoluteDeadband | ArchiveAbsoluteDeadbanding |
StepValue | NumberOfElements |
CalcType | |
- StringLength = 2 times the number of characters (ASCII, Single Byte). For example, "ABC" = 6 StringLength.
- TimeStamp: Timestamp resolution is in milliseconds for this collector. Do not try to import microsecond timestamps.
You cannot import Last Modified User, Last Modified Date, and Calculation Execution Time fields on a tag.
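As a quick check of the StringLength rule noted above, the doubling can be expressed as a one-line helper (a sketch only, nothing Historian-specific, valid for single-byte ASCII strings):

```python
def string_length(value: str) -> int:
    """StringLength for single-byte (ASCII) strings: 2 x character count."""
    return 2 * len(value)

print(string_length("ABC"))  # prints 6
```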
Example of a CSV File that Imports Tags
* This is a comment
[Tags]
Tagname,Description,DataType,HiEngineeringUnits,LoEngineeringUnits
TIGER.IMPORT_TAG1.F_CV,Import Tag 1,SingleFloat,35000,0
TIGER.IMPORT_TAG2.F_CV,Import Tag 2,SingleFloat,35000,0
TIGER.IMPORT_TAG3.F_CV,Import Tag 3,SingleFloat,35000,0
TIGER.IMPORT_TAG4.F_CV,Import Tag 4,SingleFloat,35000,0
TIGER.IMPORT_TAG5.F_CV,Import Tag 5,SingleFloat,35000,0
TIGER.IMPORT_TAG6.F_CV,Import Tag 6,SingleFloat,35000,0
TIGER.IMPORT_TAG7.F_CV,Import Tag 7,SingleFloat,35000,0
TIGER.IMPORT_TAG8.F_CV,Import Tag 8,SingleFloat,35000,0
TIGER.IMPORT_TAG9.F_CV,Import Tag 9,SingleFloat,35000,0
TIGER.IMPORT_TAG10.F_CV,Import Tag 10,SingleFloat,35000,0
Example of a CSV File that Imports Data and Data Quality
* This is a comment
[Data]
Tagname,TimeStamp,Value,DataQuality
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:07,1,Good
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:08,2,Good
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:09,3,Good
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:10,4,Good
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:11,5,Bad
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:12,6,Bad
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:13,7,Good
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:14,8,Bad
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:15,9,Good
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:16,10,Good
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:17,11,Good
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:18,12,Good
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:19,13,Bad
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:20,14,Good
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:21,15,Good
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:22,16,Bad
TIGER.IMPORT_TAG1.F_CV,7/20/01 11:23,17,Good
Example of a CSV File that Imports Messages
[Messages]
TimeStamp,Topic,Username,MessageNumber,MessageString,Substitutions
28-Aug-2002 19:39:30.567,General,User1,0,A Test Message value 2 with milliseconds,,
28-Aug-2002 19:40:00.000,General,User1,0,A Test Message value 0,,
Example of a CSV File that Imports Tags with Step Values
[Tags]
Tagname,Description,DataType,HiEngineeringUnits,LoEngineeringUnits,StepValue
TIGER.IMPORT_TAG1.F_CV,Import Tag 1,SingleFloat,35000,0,TRUE
TIGER.IMPORT_TAG2.F_CV,Import Tag 2,SingleFloat,35000,0,TRUE
TIGER.IMPORT_TAG3.F_CV,Import Tag 3,SingleFloat,35000,0,TRUE
TIGER.IMPORT_TAG4.F_CV,Import Tag 4,SingleFloat,35000,0,TRUE
TIGER.IMPORT_TAG5.F_CV,Import Tag 5,SingleFloat,35000,0,TRUE
TIGER.IMPORT_TAG6.F_CV,Import Tag 6,SingleFloat,35000,0,TRUE
TIGER.IMPORT_TAG7.F_CV,Import Tag 7,SingleFloat,35000,0,TRUE
TIGER.IMPORT_TAG8.F_CV,Import Tag 8,SingleFloat,35000,0,TRUE
TIGER.IMPORT_TAG9.F_CV,Import Tag 9,SingleFloat,35000,0,TRUE
TIGER.IMPORT_TAG10.F_CV,Import Tag 10,SingleFloat,35000,0,TRUE
Example of a CSV File that Imports Alarms
[Alarms]
DataSource,Condition,Source,StartTime,TimeStamp
FileCollector,HI,Mixer,3/10/2005 12:52:02,3/10/2005 12:52:02
FileCollector,HIHI,Mixer,3/10/2005 12:52:02,3/10/2005 12:59:12
Example of a CSV File that Imports Enumerated Set
[EnumeratedSet]
SetName,SetDescription,StateLowValue,StateHighValue,StateDescription,StateName,StateRawValueDataType,
LastModified,LastModifiedUser,AdministerSecurityGroup,NumberOfStatesInThisSet
TestSet5,TestDesc,1,10,State1Desc,State1,DoubleFloat,,,,2
TestSet5,TestDesc,11,20,State2Desc,State2,DoubleFloat,,,,2
Example of a CSV File that Imports Array Tags
[Tags]
Tagname,Description,DataType,HiEngineeringUnits,LoEngineeringUnits,NumberOfElements
TIGER.IMPORT_TAG1.F_CV,Import Tag 1,SingleFloat,35000,0,-1
TIGER.IMPORT_TAG2.F_CV,Import Tag 2,SingleFloat,35000,0,0
TIGER.IMPORT_TAG3.F_CV,Import Tag 3,SingleFloat,35000,0,-1
TIGER.IMPORT_TAG4.F_CV,Import Tag 4,SingleFloat,35000,0,-1
Example of a CSV File that Imports Array Tag Data
[Data]
Tagname,TimeStamp,Value,DataQuality
ArrayTag[0],6/11/2013 09:00:00,1,Good
ArrayTag[1],6/11/2013 09:00:00,2,Good
ArrayTag[2],6/11/2013 09:00:00,3,Good
ArrayTag[3],6/11/2013 09:00:00,4,Good
ArrayTag[0],6/11/2013 09:10:00,5,Good
ArrayTag[1],6/11/2013 09:10:00,6,Good
ArrayTag[2],6/11/2013 09:10:00,7,Good
Example of a CSV File that Imports MultiField Tag Data
[Data]
Tagname,TimeStamp,Value,DataQuality
MultiField.F1,05-22-2013 14:15:00,4,Good
MultiField.F1,05-22-2013 14:15:01,7,Good
MultiField.F1,05-22-2013 14:15:02,9,Good
MultiField.F2,05-22-2013 14:15:00,241,Good
MultiField.F2,05-22-2013 14:15:01,171,Good
MultiField.F2,05-22-2013 14:15:02,191,Good
Example of a CSV File that Imports a User Defined Type
[UserDefinedType]
UserDefinedTypeName,UserDefinedTypeDescription,FieldName,FieldDescription,FieldDataType,IsMasterField,NumberOfFields
UDT1,UDTdesc,Field1,F1desc,SingleInteger,FALSE,2
UDT1,UDTdesc,Field2,F2desc,DoubleInteger,FALSE,2
[Tags]
Tagname,Description,DataType,UserDefinedTypeName
Mfield,mfdesc,MultiField,UDT1
Example of a CSV File that Imports Python Expression Tags
[Tags]
Tagname,CollectorName,CalcType,SourceAddress,DataType,Description
TagDerivedFromRawValue,SimulationCollector,PythonExpr,"{""imports"":[""math""],
""script"":""tag.value + math.pow10,tag.value/70)"",""parameters"":[{""name"":
""tag"",""source"":{""address"":""Simulation00001"",""dataType"":""SingleFloat""}}]}",
SingleFloat,Python Expression Tag example
- Python Expression Tags do not support array or multifield tags.
- It is important to include the CalcType header and set it to PythonExpr for each Python Expression tag. If the file contains a mix of tags that are Python Expression Tags with those that are not, then the tags that are not should have the CalcType field set to Raw.
- For Python Expression Tags, the SourceAddress must contain the tag's minified JSON configuration, constructed as described in the topic on Constructing the JSON Configuration for a Python Expression Tag. (Minified JSON has no newline characters or comments. There are tools that can help you minify JSON.)
- Note that the example in the CSV file uses repeated quotation marks ("") in order to escape quotation marks ("), which are a special character.
- It is important to check that your JSON is valid, since no validation will be performed on the JSON at tag creation.
For more information on these tags, refer to the Python Expression Tags documentation.
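Because the SourceAddress must be minified JSON and CSV requires doubled quotation marks, it can help to build the field programmatically. The sketch below uses a hypothetical configuration (the simulation address, data type, and script are placeholders; substitute your own), minifies it with json.dumps, and doubles the quotes to produce a valid CSV cell:

```python
import json

# Hypothetical Python Expression Tag configuration -- substitute your own
# source address, data type, and script.
config = {
    "imports": ["math"],
    "script": "tag.value + math.pow(10, tag.value/70)",
    "parameters": [
        {"name": "tag",
         "source": {"address": "Simulation00001", "dataType": "SingleFloat"}},
    ],
}

minified = json.dumps(config, separators=(",", ":"))  # no spaces, newlines, or comments
csv_cell = '"' + minified.replace('"', '""') + '"'    # double each quote and wrap the cell

print(csv_cell)
```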
XML File Format
Format
The format for an XML file is as follows:
For a list of tags:
<Import>
<TagList>
<Tag>
<Tagname> .....</Tagname>
<Description> Test </Description>
</Tag>
</TagList>
</Import>
For Data:
<Import>
<Datalist>
<Tag>
<Data>
<Timestamp>..... </Timestamp>
<Value>.... </Value>
</Data>
</Tag>
</Datalist>
</Import>
For Messages:
<Import>
<MessageList>
<Data>
<Timestamp>..... </Timestamp>
<Topic>.... </Topic>
<Username>.... </Username>
<MessageNumber>.... </MessageNumber>
<MessageString>.... </MessageString>
<Substitutions>.... </Substitutions>
</Data>
</MessageList>
</Import>
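If you generate XML import files from another system, a short sketch such as the following (the tag names are placeholders) produces the TagList structure shown above using the Python standard library; write the result with a single-byte encoding, since the collector imports ANSI-encoded files only:

```python
import xml.etree.ElementTree as ET

def build_tag_import(tags):
    """tags: iterable of (tagname, description) pairs -> <Import><TagList> document text."""
    root = ET.Element("Import")
    tag_list = ET.SubElement(root, "TagList")
    for name, description in tags:
        tag = ET.SubElement(tag_list, "Tag", Name=name)
        ET.SubElement(tag, "Tagname").text = name
        ET.SubElement(tag, "Description").text = description
    return ET.tostring(root, encoding="unicode")

xml_text = build_tag_import([("TIGER.IMPORT_TAG1.F_CV", "Import Tag 1")])
with open("tagimport.xml", "w", encoding="ascii") as f:  # collector expects ANSI encoding
    f.write(xml_text)
```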
Example of an XML File that Imports Tags
<Import>
<TagList Version="1.0.71">
<Tag Name="TIGER.IMPORT_TAG1.F_CV">
<Tagname>TIGER.IMPORT_TAG1.F_CV</Tagname>
<Description>Import Tag 1</Description>
<EngineeringUnits> PSI </EngineeringUnits>
</Tag>
<Tag Name="TIGER.IMPORT_TAG11.F_CV">
<Tagname>TIGER.IMPORT_TAG11.F_CV</Tagname>
<Description>Import Tag 1</Description>
<EngineeringUnits> PSI </EngineeringUnits>
</Tag>
<Tag Name="TIGER.IMPORT_TAG12.F_CV">
<Tagname>TIGER.IMPORT_TAG12.F_CV</Tagname>
<Description>Import Tag 2</Description>
<EngineeringUnits> PSI </EngineeringUnits>
</Tag>
<Tag Name="TIGER.IMPORT_TAG13.F_CV">
<Tagname>TIGER.IMPORT_TAG13.F_CV</Tagname>
<Description>Import Tag 3</Description>
<EngineeringUnits> PSI </EngineeringUnits>
</Tag>
<Tag Name="TIGER.IMPORT_TAG14.F_CV">
<Tagname>TIGER.IMPORT_TAG14.F_CV</Tagname>
<Description>Import Tag 4</Description>
<EngineeringUnits> PSI </EngineeringUnits>
</Tag>
<Tag Name="TIGER.IMPORT_TAG15.F_CV">
<Tagname>TIGER.IMPORT_TAG15.F_CV</Tagname>
<Description>Import Tag 5</Description>
<EngineeringUnits> PSI </EngineeringUnits>
</Tag>
<Tag Name="TIGER.IMPORT_TAG2.F_CV">
<Tagname>TIGER.IMPORT_TAG2.F_CV</Tagname>
<Description>Import Tag 2</Description>
<EngineeringUnits> PSI </EngineeringUnits>
</Tag>
<Tag Name="TIGER.IMPORT_TAG21.F_CV">
<Tagname>TIGER.IMPORT_TAG21.F_CV</Tagname>
<Description>Import Tag 1</Description>
<EngineeringUnits> PSI </EngineeringUnits>
</Tag>
</TagList>
</Import>
Example of an XML file that Imports Data and Data Quality
<Import>
<DataList Version="1.0.71">
<Tag Name="TIGER.IMPORT_TAG1.F_CV">
<Data>
<TimeStamp>20-Jul-2001 11:00:18.000</TimeStamp>
<Value>0</Value>
<DataQuality>Good</DataQuality>
</Data>
<Data>
<TimeStamp>20-Jul-2001 11:00:36.000</TimeStamp>
<Value>0</Value>
<DataQuality>Good</DataQuality>
</Data>
<Data>
<TimeStamp>20-Jul-2001 11:00:54.000</TimeStamp>
<Value>0</Value>
<DataQuality>Bad</DataQuality>
</Data>
<Data>
<TimeStamp>20-Jul-2001 11:01:12.000</TimeStamp>
<Value>0</Value>
<DataQuality>Good</DataQuality>
</Data>
</Tag>
</DataList>
</Import>
Example of an XML file that Imports Messages
<Import>
<MessageList Version="1.0.71">
<Data>
<TimeStamp>28-Aug-2002 19:42:00.000</TimeStamp>
<Topic>General</Topic>
<Username>XMLUser</Username>
<MessageNumber>0</MessageNumber>
<MessageString>Another test message</MessageString>
<Substitutions></Substitutions>
</Data>
<Data>
<TimeStamp>28-Aug-2002 19:48:00.000</TimeStamp>
<Topic>General</Topic>
<Username>XMLUser</Username>
<MessageNumber>1</MessageNumber>
<MessageString>Message One</MessageString>
<Substitutions></Substitutions>
</Data>
</MessageList>
</Import>
Example of an XML file that Imports a Tag with a Step Value
<Import>
<TagList Version="1.0.71">
<Tag Name="TAG6">
<Tagname>TAG6</Tagname>
<StepValue> TRUE </StepValue>
</Tag>
</TagList>
</Import>
Example of an XML file that Imports Alarms
<Import>
<AlarmList Version="1.0.71">
<Alarm>
<Attribute name="Acked" value="false"/>
<Attribute name="Actor" value="TheActor"/>
<Attribute name="Condition" value="Condition"/>
<Attribute name="DataSource" value="File collector"/>
<Attribute name="Enabled" value="true"/>
<Attribute name="EndTime" value="12/25/2005 12:47:59.003"/>
<Attribute name="EventCategory" value="Process"/>
<Attribute name="Message" value="My message."/>
<Attribute name="Quality" value="Good"/>
<Attribute name="Severity" value="250"/>
<Attribute name="Source" value="SourceXML000003"/>
<Attribute name="StartTime" value="12/25/2005 12:47:59.003"/>
<Attribute name="SubCondition" value="Hi"/>
<Attribute name="TagName" value="TheTagName"/>
<Attribute name="Timestamp" value="12/09/2005 12:47:59.003"/>
</Alarm>
<Alarm>
<Attribute name="Acked" value="false"/>
<Attribute name="Actor" value="TheActor"/>
<Attribute name="Condition" value="Condition"/>
<Attribute name="DataSource" value="File collector"/>
<Attribute name="Enabled" value="true"/>
<Attribute name="EndTime" value="12/25/2005 12:47:59.004"/>
<Attribute name="EventCategory" value="Process"/>
<Attribute name="Message" value="My message."/>
<Attribute name="Quality" value="Good"/>
<Attribute name="Severity" value="250"/>
<Attribute name="Source" value="SourceXML000004"/>
<Attribute name="StartTime" value="12/25/2005 12:47:59.004"/>
<Attribute name="SubCondition" value="Hi"/>
<Attribute name="TagName" value="TheTagName"/>
<Attribute name="Timestamp" value="12/25/2005 12:47:59.004"/>
</Alarm>
</AlarmList>
</Import>
Example of an XML file that Imports Enumerated Set
<Import>
<EnumeratedSetList>
<EnumeratedSet SetName="TestSet7">
<EnumeratedState StateName="State1">
<SetName>TestSet7</SetName>
<SetDescription>TestDesc</SetDescription>
<StateLowRawValue>1</StateLowRawValue>
<StateHighRawValue>10</StateHighRawValue>
<StateDescription>State1Desc</StateDescription>
<StateName>State1</StateName>
<StateRawValueDataType>DoubleFloat</StateRawValueDataType>
<LastModified>9/17/2012 16:37:48.260000</LastModified>
<LastModifiedUser>GECORPORATE\312006949</LastModifiedUser>
<AdministratorSecurityGroup></AdministratorSecurityGroup>
<NumberOfStatesInThisSet>2</NumberOfStatesInThisSet>
</EnumeratedState>
<EnumeratedState StateName="State2">
<SetName>TestSet7</SetName>
<SetDescription>TestDesc</SetDescription>
<StateLowRawValue>11</StateLowRawValue>
<StateHighRawValue>20</StateHighRawValue>
<StateDescription>State2Desc</StateDescription>
<StateName>State2</StateName>
<StateRawValueDataType>DoubleFloat</StateRawValueDataType>
<LastModified>9/17/2012 16:37:48.260000</LastModified>
<LastModifiedUser>GECORPORATE\312006949</LastModifiedUser>
<AdministratorSecurityGroup></AdministratorSecurityGroup>
<NumberOfStatesInThisSet>2</NumberOfStatesInThisSet>
</EnumeratedState>
</EnumeratedSet>
</EnumeratedSetList>
</Import>
Example of an XML File that Imports Array Tags
<Import>
<TagList Version="1.0.71">
<Tag Name="ArrayTag1">
<Tagname>ArrayTag</Tagname>
<Description>Import array Tag 1</Description>
<EngineeringUnits> PSI </EngineeringUnits>
<NumberOfElements>-1</NumberOfElements>
</Tag>
<Tag Name="ArrayTag2">
<Tagname>ArrayTag2</Tagname>
<Description>Import array Tag 2</Description>
<EngineeringUnits> PSI </EngineeringUnits>
<NumberOfElements>-1</NumberOfElements>
</Tag>
</TagList>
</Import>
Example of an XML File that Imports Array Tag data
<Import>
<DataList Version="1.0.71">
<Tag Name="ArrayTag[0]">
<Data>
<TimeStamp>11-June-2013 11:00:18.000</TimeStamp>
<Value>1</Value>
<DataQuality>Good</DataQuality>
</Data>
<Data>
<TimeStamp>11-June-2013 11:01:18.000</TimeStamp>
<Value>2</Value>
<DataQuality>Good</DataQuality>
</Data>
</Tag>
<Tag Name="ArrayTag[1]">
<Data>
<TimeStamp>11-June-2013 11:00:18.000</TimeStamp>
<Value>3</Value>
<DataQuality>Good</DataQuality>
</Data>
<Data>
<TimeStamp>11-June-2013 11:01:18.000</TimeStamp>
<Value>4</Value>
<DataQuality>Good</DataQuality>
</Data>
</Tag>
</DataList>
</Import>
Example of an XML File that Imports Python Expression Tags
<Import>
<TagList Version="1.0.71">
<Tag Name="TagDerivedFromRawValue">
<Tagname>TagDerivedFromRawValue</Tagname>
<CollectorName>OpcServerCollector</CollectorName>
<SourceAddress>
"{"imports":["math"],"script":"tag.value + math.pow(10,tag.value/70)","parameters":
[{"name":"tag","source":{"address":"IO/READONLY/Temperature","dataType":"DoubleFloat"}}]}"
</SourceAddress>
<CalcType>PythonExpr</CalcType>
<DataType>DoubleFloat</DataType>
<Description>Python Expression Tag example</Description>
</Tag>
</TagList>
</Import>
- Python Expression Tags do not support array or multifield tags.
- It is important to include the CalcType header and set it to PythonExpr for each Python Expression tag. If the file contains a mix of tags that are Python Expression Tags with those that are not, then the tags that are not should have the CalcType field set to Raw.
- For Python Expression Tags, the SourceAddress must contain the tag's minified JSON configuration constructed as described in the topic on Constructing the JSON Configuration for a Python Expression Tag. (Minified JSON has no newline characters or comments. There are tools which can help you minify JSON.)
- It is important to check that your JSON is valid, since no validation will be performed on the JSON at tag creation.
For more information about these tags, refer to the Python Expression Tags documentation.
Summary of File Collector Features
The following table outlines the features of the File collector.
Feature | Capability |
---|---|
Browse Source For Tags | No |
Browse Source For Tag Attributes | No |
Polled Collection | No |
Maximum Poll Rate | N/A |
Unsolicited Collection | No |
Time Stamp Resolution | 1 ms |
Accept Device Timestamps | Yes |
Floating Point Data | Yes |
Integer Data | Yes |
String Data | Yes |
Binary Data | No |
Import CSV Files | Yes |
Import XML Files | Yes |
Import multiple files with various extensions | Yes |
Collector compression | No |
Reads data, tags and messages | Yes |
Start/stop collection from Historian Administrator | No. Files are scanned at a user-specified interval; set the interval to 0 to stop scanning. |
Set file specs and import interval from Historian Administrator. (Timestamps for data or messages may be at intervals less than the import interval.) | Yes |
Python Expression Tags | No |
Create Historian Tags | Yes. Note: The File collector can be used to create Python Expression Tags for those collectors that support them. See the Python Expression Tags book for a list of collectors that support Python Expression Tags. |
File Encoding | ANSI |
The Configuration Section for File Collectors
To access the Configuration section for a File collector, select a File collector from the list on the left and select Configuration. The page shown in the following figure appears.
Collector-Specific Configuration (File)
The Configuration section displays the following information.
Field | Description |
---|---|
Scan Interval | The rate at which the input directory, Historian\ImportFiles, is scanned for new files. To change it, enter a new value. The scan interval cannot exceed 65 seconds. Note: Changes to the Scan Interval do not take effect until the File collector is restarted. |
CSV File Specifications | The file extension for the CSV file being imported. You can specify more than one extension type, such as: csv, txt, dat. |
XML File Specification | The file extension for the XML file being imported. |
Purge Processed Files (days) | The number of days a file can reside in the Processed Files directory before being deleted automatically. |
Purge Error Files (days) | The number of days an error file is retained before being deleted automatically. The default is 10. |
Troubleshooting the File Collector
If you experience any problems with the File collector, use the .LOG file (in the \LogFiles folder) to troubleshoot. The .LOG file sometimes records errors that are never forwarded to Historian Administrator. For example, if you have no archives in your system and you attempt to import a .CSV file with formatting errors, the file is not processed and no alerts are sent to Historian Administrator (because the message database is not created until at least one archive exists). The error does, however, appear in the .LOG file.
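When a large import fails, it can save time to pull just the error entries out of the collector logs. A minimal sketch follows; the log directory is an assumption based on the default install layout, and the file name pattern follows the FileCollector_YMDHMS.log convention described earlier:

```python
from pathlib import Path

LOG_DIR = Path(r"C:\Historian\LogFiles")   # assumed location of the LogFiles folder

def collector_errors():
    """Yield (log file name, line) pairs for error entries in File collector logs."""
    for log in sorted(LOG_DIR.glob("FileCollector_*.log")):
        for line in log.read_text(errors="replace").splitlines():
            if "Error" in line:
                yield log.name, line

for name, line in collector_errors():
    print(f"{name}: {line}")
```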
The following table lists typical Error messages received with the File collector and tips to troubleshoot those messages.
Typical Error Messages and Suggested Solutions
Error Message | Troubleshoot by... |
---|---|
12-Aug-01 13:38:00 - Import Line Error: Input past end of file 12-Aug-01 13:38:00 - Error Occurred On Line 12: General Format Error | There is an extra line in the file. Open the file in Notepad and remove the extra line. |
10-Aug-01 15:47:02 - Import Line Error: Type mismatch 10-Aug-01 15:47:02 - Error Occurred On Line 2: General Format Error | Two fields were rolled into one due to a missing comma. Open the file in Notepad and add the comma. |
12-Aug-01 14:18:12 - Invalid Import Field: TimeResolution 12-Aug-01 14:18:12 - zProcessFragment>> Aborted Import Due To Formatting Errors | An import field is invalid (in this case, TimeResolution should be StoreMilliseconds), so the import aborts. Correct the field name in the header and re-import the file. |
Identifying Lines in which an Error Occurs
If the File collector log file contains an error description such as:
20-Jul-01 10:17:17 - Import Line Error: Input past end of file
20-Jul-01 10:17:17 - Error occurred on Line 7: General Format Error
the line number in the log is produced by the SDK, not the File collector. This means that the line number is counted relative to the header or field list for each section of the file, ignoring comments and blank lines.
CSV File Imports
If a .CSV file has an extra line in it, it may not successfully import using the File collector. If a .CSV file has extra commas on the data line, it may not import completely.
Typically, a .CSV file must be less than 10 MB and an ideal file should be 1 or 2 MB. If the file size is greater, then the File collector may not respond.
If you view such files in Microsoft Excel, the extra characters are not visible. It is recommended that you use a text editor to examine files that cause format errors when you attempt to import them.
You cannot import CSV or XML data that goes back beyond the Archive Active Hours setting (1 month by default). Adjust your Archive Active Hours setting and re-import the data.
Also, data before the first archive will not be imported.
Troubleshooting Large File Import
To prevent a locked-file scenario when building large files for import, first build the file under a temporary name or in a directory that the File collector ignores, and then rename or move the file to its final name in the Incoming directory once it is fully built.
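A minimal sketch of that pattern follows; the paths are assumptions, and the key point is that the file only appears in Incoming under its final name once it is completely written:

```python
import shutil
from pathlib import Path

INCOMING = Path(r"C:\Historian\ImportFiles\Incoming")  # assumed install path
STAGING = Path(r"C:\Temp")                             # any folder the File collector ignores

def publish_large_import(rows, final_name="bigimport.csv"):
    """Build the CSV fully in a staging folder, then move it into Incoming in one step."""
    temp_path = STAGING / (final_name + ".tmp")
    with temp_path.open("w", newline="") as f:
        f.write("[Data]\n")
        f.write("Tagname,TimeStamp,Value,DataQuality\n")
        for tagname, timestamp, value, quality in rows:
            f.write(f"{tagname},{timestamp},{value},{quality}\n")
    # The move happens only after the file is completely written, so the
    # collector never picks up a partially built file.
    shutil.move(str(temp_path), str(INCOMING / final_name))
```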
Unable to Access Historian Administrator After Installing File collector
- The previous version of the collector will be in the stopped state, and the new version will be in the running state. In earlier installations of Historian 7.0 (SP2\SP3\SP4), the File collector name was based on the destination node (destinationNode_Collector). From SP5 onwards, the collector name is based on the source node (sourceNode_Collector).
- Data collection stops for the tags that were previously added to the File collector.
To resolve this issue:
- In the tag properties, change the collector to the newer version for all tags that were added using the older version of the collector.
- Delete the older version of the collector from the Collectors section in Historian Administrator.
Configuration of File Collector Specific Fields
Enter values for the File collector-specific field parameters through the File Collector Maintenance - Configuration section of Historian Administrator.
The collector-specific field descriptions are listed in the following table.
Field | Description |
---|---|
Scan Interval (seconds) | The collector initiates the import operation at the beginning of the scan interval specified in this field. Note: Changes to the Scan Interval do not take effect until the File collector is restarted. |
CSV File Specification | The file extension for a CSV file to be imported. |
XML File Specification | The file extension for an XML file to be imported. |
Purge Processed Files After (days) | The contents of the Processed Files folder are automatically purged after the number of days specified in this field. |
Purge Error Files After (days) | The contents of the Error Files folder are automatically purged after the number of days specified in this field. |
Date Formats Supported by the File Collector
System Date Format | Supported Date Format in the XML Files | Supported Date Format in the CSV Files |
---|---|---|
dd-MMM-yy | | |
yyyy-MM-dd | | |
yy/MM/dd | | |
MM/dd/yyyy | | |
MM/dd/yy | | |
M/d/yy | | |
M/d/yyyy | | |