To complete this exercise, first download and save four files to your computer: Q1.csv, Q2.csv, Q3.csv, and Q4.csv (see attachments).
Many Arcadia users are interested in using Devo to analyze some of their static, historical data saved in files. These files are usually in CSV or TSV format, or at least contain event fields separated by some delimiter.
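Before uploading, it can be worth confirming locally which delimiter a file actually uses, since you will have to select it in the tool later. A minimal sketch using Python's standard csv module (the helper name and the candidate delimiter set are our own choices, not part of the exercise):

```python
import csv

def detect_delimiter(path, candidates=",;\t|"):
    """Guess the field delimiter of a delimited text file from a sample."""
    with open(path, newline="") as f:
        sample = f.read(4096)  # a few KB is enough for the sniffer
    return csv.Sniffer().sniff(sample, delimiters=candidates).delimiter
```

For the comma-separated files in this exercise, this should report `,`.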
In this case, the VP of Sales wants to upload the quarterly sales transactions from his key accounts team so that he can later generate an annual report of 2020 sales to key accounts.
To ingest this type of data, we can leverage the Data upload tool.
Select this entry in the navigation pane. Then, choose to upload a local file.
The first two tag levels are fixed as my.upload. You can then use the third and fourth levels to identify the content of your log file.
Enter the third and fourth tag levels following this model:
The Date parsing type can be either the current date, or Devo can use the date from a field in the file. If you choose the latter, your file must contain a field whose values follow a recognizable timestamp format.
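If you plan to use Date from a log field, you can check locally that every value in the candidate date column parses cleanly before uploading. A sketch under assumptions (the column name `date` and the `%Y-%m-%d` format are illustrative; adjust both to match your file):

```python
import csv
from datetime import datetime

def bad_date_rows(path, column="date", fmt="%Y-%m-%d"):
    """Return line numbers whose date column fails to parse with fmt."""
    bad = []
    with open(path, newline="") as f:
        # start=2: line 1 is the header row
        for lineno, row in enumerate(csv.DictReader(f), start=2):
            try:
                datetime.strptime(row[column], fmt)
            except (ValueError, KeyError, TypeError):
                bad.append(lineno)
    return bad
```

An empty result means the column should be usable as the key event date.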
Choose Date from a log field as the Date parsing type.
Select Delimiter as the Parsing log type and then a comma (,) as Delimiter.
Based on these last selections, Devo can check the events so that you can indicate which field you want to use as the key event date.
Select the field that represents the event date. Finally, you can confirm and submit the log file.
You’ve successfully uploaded the first of four files.
Repeat the steps and upload the other three files.
When you have uploaded the last of the four files, wait a few minutes, then go to Data search and see if you can locate the new data tables in the Finder.
💡 Hint: Use the Refresh tool in the Finder.
After following the steps, I do find my new data tables in the Finder:
It took me a while to work out which search dates to use before I could see the data, but here it is now:
When trying to upload the file, I get an undefined error.
I opened the CSV locally and can't see anything wrong with the file.
Any idea why this might be happening? I'm working in the Lab environment hosted in the APAC region with Admin rights.
Hi Tipsy! Sending you a DM to avoid getting this feed full of troubleshooting issues.
Is there a way to parse custom logs in a structured way?
As far as I know, we can work around it with “peek() as fieldname”, but I would appreciate a UI for parsing a full log into its columns. That way, we wouldn't bother you guys with this routine stuff.
Yes, we have an Autoparser. You can review the details in our documentation and see whether it works for your use case.