Azure Data Factory
ETL, NonCattleComDataLoad
External Data Sources
Some…
Azure Data Factory
ETL
Build Dimensions
Build Facts
RPT_DailyFeedings
R Scripts
n ~ 35 known at this point
- not all needed for customer instance
There are 3 huge R scripts for Closeouts
One large script is inline in a SQL Agent Job
Some R scripts only go get data from spreadsheets and put it into RPT or Stage tables
Usually data is put into a stage table and then a merge-update stored proc is run to pull from stage and insert into an RPT table (see the sketch below this list)
Many R scripts are labeled as ALCC specific
- back to: are we going to have an ETL specific to ALCC and another specific to Revoro?
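- Rough sketch of that spreadsheet → stage → merge-update pattern, assuming the scripts use readxl plus DBI/odbc against SQL Server; the file path, connection details, and stored-proc name below are hypothetical stand-ins, not the actual script contents.

    # Pull rows from a source spreadsheet, land them in a stage table, then call
    # the merge-update proc that moves them into the RPT table.
    library(readxl)
    library(DBI)

    feedings <- readxl::read_excel("DailyFeedings.xlsx")   # hypothetical file

    con <- DBI::dbConnect(odbc::odbc(),
                          Driver   = "ODBC Driver 17 for SQL Server",
                          Server   = "dw-server",          # hypothetical server
                          Database = "DW",
                          Trusted_Connection = "Yes")

    # Overwrite keeps the stage load repeatable.
    DBI::dbWriteTable(con, "Stage_DailyFeedings", feedings, overwrite = TRUE)

    # Hypothetical proc name; this is the merge-update step into RPT_DailyFeedings.
    DBI::dbExecute(con, "EXEC dbo.usp_Merge_RPT_DailyFeedings")

    DBI::dbDisconnect(con)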
CattleCom Tables
LoadCattleComOtherStage
LoadCattleComOtherStage2
LoadCattleComStage
LoadCattleComStage2
Getting data from DW back into CC:
1) Seg ADGs for sort - basically one input (several models that run on DW data). Pump data to the warehouse: a model is run, its output is put into a temp table in CC, then Jeremy has a job that updates the live table in CC with those values (sketch below).
2) Seg Closeouts (the Closeout report available in CC that BGs use) - could we create an embedded report for this? We would need to hand a location to the report.
3) TagsUpdateGenomicTypeID - goes into a stage genomic results table (Kirby creates the answer from an R script). Genomic samples: this is what tells the system to sort cattle correctly when they have a genomic type; the genomic predictions from these samples fall into a four-way classification.
- Idea: use PowerShell to update these values
- Day 1 solution: point the R scripts at cloud CattleCom
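- A hedged sketch of step 1 above (Seg ADGs), assuming DBI/odbc connections to both the warehouse and CattleCom; server, table, and proc names are placeholders, and the cut() call only stands in for the real sort models.

    library(DBI)

    dw <- DBI::dbConnect(odbc::odbc(), Driver = "ODBC Driver 17 for SQL Server",
                         Server = "dw-server", Database = "DW",
                         Trusted_Connection = "Yes")
    cc <- DBI::dbConnect(odbc::odbc(), Driver = "ODBC Driver 17 for SQL Server",
                         Server = "cc-server", Database = "CattleCom",
                         Trusted_Connection = "Yes")

    # Model runs on DW data (placeholder query and scoring).
    inputs <- DBI::dbGetQuery(dw, "SELECT AnimalID, AvgDailyGain FROM dbo.RPT_SegADG")
    inputs$SortGroup <- cut(inputs$AvgDailyGain, breaks = 4, labels = FALSE)

    # Output lands in a temp/stage table inside CattleCom...
    DBI::dbWriteTable(cc, "Stage_SegADG", inputs, overwrite = TRUE)

    # ...and a separate job merges it into the live CC table (proc name hypothetical).
    DBI::dbExecute(cc, "EXEC dbo.usp_Update_SegADG_FromStage")

    DBI::dbDisconnect(dw); DBI::dbDisconnect(cc)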
*Many things are keyed off of the CODE and the ID - example: Diagnoses
- If you have the same code at two locations, it can mess up the process (small illustration below)
- Suggestion: create a DW from each CC instance for each FY and then for the CO
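- Tiny made-up illustration of the CODE/ID problem: two CC instances can reuse the same Diagnosis code, so anything keyed on Code alone collides, while keying on the instance plus Code (or on the surrogate ID) stays unique.

    diag_a <- data.frame(Source = "CC_ALCC",   DiagnosisID = 101,
                         Code = "RESP", Description = "Respiratory")
    diag_b <- data.frame(Source = "CC_Revoro", DiagnosisID = 987,
                         Code = "RESP", Description = "Respiratory - chronic")
    all_diag <- rbind(diag_a, diag_b)

    # Keyed on Code alone: "RESP" maps to two different rows.
    table(all_diag$Code)

    # Keyed on instance + Code (or on DiagnosisID): each row stays distinct.
    table(paste(all_diag$Source, all_diag$Code, sep = "|"))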