Article Number
000033219
Applies To
Product(s): Archer
Version(s): All Versions
Primary Deployment: On Premises
Description
When running the Archer Configuration Reports (ACR) from the Archer Control Panel (ACP), you may find that the tblDatafeedMessage database table is quite large and appears among the top 10 tables by row count or size.
This can happen when one or more of the following conditions is true:
- A large number of Data Feeds are configured
- Data Feeds run frequently (for example, every 5 minutes)
- Data Feeds update a significant volume of data on each run
By default, Archer does not automatically purge old data from the tblDatafeedMessage table. The table contains the Run Detail Messages from a Data Feed's Execution History. Purging old rows from this table can help reduce the overall size of the database.
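If you want to confirm how much space the table is currently using outside of the ACR report, you can check it directly against the Archer instance database with the standard SQL Server sp_spaceused procedure (the dbo schema is assumed here):
-- Report row count and reserved space for the Data Feed message table
EXEC sp_spaceused N'dbo.tblDatafeedMessage';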
Resolution
1. Open SQL Server Management Studio and expand the SQL Server Agent node in the Object Explorer pane.
2. Right-click on Jobs and select New Job
3. A popup will appear. Enter a name for the job (e.g., DataFeedMessagePurge)
4. Make sure the Enabled checkbox is selected and click the Steps page in the left pane
5. Click the New… button at the bottom of the page to create a new step for the job
6. Provide a step name (e.g., DataFeedMessagePurge)
7. Leave Type and Run as at their defaults, and choose your Archer instance database from the Database dropdown
8. Paste the following statement into the Command window (see the Notes at the end of this article to adjust the number of days to retain):
DECLARE @count int;
SET @count = 10000; -- maximum number of rows to delete per run

-- Delete Data Feed run detail messages older than 7 days
DELETE FROM tblDatafeedMessage
WHERE datafeed_message_id IN (
    SELECT TOP (@count) datafeed_message_id
    FROM tblDatafeedMessage
    WHERE create_date < DATEADD(DAY, -7, GETUTCDATE())
);
9. Click OK to save the step
10. Choose Schedules from the left pane
11. Click the New… button
12. Provide a schedule name (e.g., DataFeedMessagePurge)
13. Make sure the following settings are in place:
   - Schedule type: Recurring
   - Enabled: checked
   - Frequency: Weekly
   - Recurs every: 1 week
   - Sunday is checked
   - Occurs once at: 3:00 AM
   - Start date: {today's date}
   - No end date selected
14. Click OK to save the schedule
15. Click OK to save the job
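Alternatively, if you prefer to script the job rather than create it through the UI, the same job can be created with the SQL Server Agent stored procedures in msdb. The sketch below is a rough equivalent of the steps above, not part of the original article; ArcherInstance is a placeholder for your Archer instance database name, and the schedule values correspond to the weekly Sunday 3:00 AM schedule described above.
USE msdb;
GO
-- Create the job (steps 2-4)
EXEC dbo.sp_add_job
    @job_name = N'DataFeedMessagePurge',
    @enabled = 1;

-- Add the T-SQL step that runs the purge statement (steps 5-8)
EXEC dbo.sp_add_jobstep
    @job_name = N'DataFeedMessagePurge',
    @step_name = N'DataFeedMessagePurge',
    @subsystem = N'TSQL',
    @database_name = N'ArcherInstance',  -- placeholder: your Archer instance database
    @command = N'DECLARE @count int;
SET @count = 10000;
DELETE FROM tblDatafeedMessage
WHERE datafeed_message_id IN (
    SELECT TOP (@count) datafeed_message_id
    FROM tblDatafeedMessage
    WHERE create_date < DATEADD(DAY, -7, GETUTCDATE())
);';

-- Weekly schedule: every Sunday at 3:00 AM (steps 10-13)
EXEC dbo.sp_add_jobschedule
    @job_name = N'DataFeedMessagePurge',
    @name = N'DataFeedMessagePurge',
    @freq_type = 8,               -- weekly
    @freq_interval = 1,           -- Sunday
    @freq_recurrence_factor = 1,  -- every 1 week
    @active_start_time = 30000;   -- 3:00 AM (HHMMSS)

-- Target the job at the local server so the Agent will run it
EXEC dbo.sp_add_jobserver
    @job_name = N'DataFeedMessagePurge',
    @server_name = N'(LOCAL)';
GO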
Notes
- You can schedule this job to run as frequently as you like; however, the table will not grow to a massive size very quickly, especially if you do not have a large number of Data Feeds running in the environment. You can also change how many days of messages the script retains. In the example above, the script removes anything older than 7 days.
- Typically, the only data in this table that may be important is the data from the most recent run of each Data Feed; this is where the "Run Detail" data shown in the front end is pulled from. Check the Data Feed schedules in your instance and configure this job to purge only data older than the interval of your most infrequently scheduled Data Feed. For example, if your most infrequently scheduled Data Feed runs every two weeks, configure the job to purge only data older than two weeks. In the example above, this would look like:
WHERE create_date < DATEADD(DAY, -14, GETUTCDATE())
- As written, this job deletes entries in batches of ten thousand per run. If there is a large volume of rows to delete, this process will take a long time to clean up the table. You may consider increasing the 10,000 value or truncating the table completely before implementing and scheduling this job to run on a regular basis.
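If you need to clear a large backlog once before the scheduled job takes over, one option (an illustrative sketch, not part of the original script) is to repeat the batched delete in a loop until no qualifying rows remain. Run it during a maintenance window, as it can generate significant transaction log activity:
DECLARE @count int = 10000;

-- Repeat the batched delete until no rows older than 7 days remain
WHILE 1 = 1
BEGIN
    DELETE FROM tblDatafeedMessage
    WHERE datafeed_message_id IN (
        SELECT TOP (@count) datafeed_message_id
        FROM tblDatafeedMessage
        WHERE create_date < DATEADD(DAY, -7, GETUTCDATE())
    );

    IF @@ROWCOUNT = 0
        BREAK;
END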