TRANSCRIPT
Use of CAATTs at Daikin Europe
Erik Claes
• About Daikin Europe N.V.
• Why start data analytics in audit?
• How we got started
• The flow of data
• Tools used
• SAP data extraction; why keep a copy?
• Issues with data
• Creating a process for requesting data
• Example of a script
• Cases
} Daikin is the largest A/C manufacturer in the world, with 15 billion USD in turnover
} Japanese company based in Osaka, Japan
} Over 50,000 employees worldwide
} Daikin Europe N.V. is headquartered in Brussels and covers the EMEA region
} In Europe: 5 production facilities, 17 affiliated companies, 5 sales offices, and a whole network of independent distributors
} Sampling does not provide sufficient assurance
} Ask more specific questions about an exception
} Cannot cover everything with a small audit department
} Stories about duplicate payments recovering millions of euros
} More automation of sampling (where full testing cannot be done)
} Used MS Access in 2004 to compare some tables
} Found Access insufficient and moved to ACL
} Used an external company to get standard scripts (e.g. duplicate payments)
} Mixed feelings
◦ Recovered 70K over a period of 3 years
◦ Internal controls are working well
} Took about 2 months to get results for duplicate payments
} Credit control scripts were created
} Started downloading data from SAP more frequently for analysis
} More usage…
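The duplicate-payments scripts themselves were bought-in ACL standard scripts; purely as an illustration of the underlying idea, a same-vendor / same-reference / same-amount test could be sketched as follows (the field names `vendor`, `invoice_ref`, and `amount` are assumptions, not the actual SAP column names):

```python
# Illustrative sketch only -- the real test ran as an ACL/Arbutus script.
import csv
from collections import defaultdict

def find_duplicate_payments(path):
    """Group payment records on (vendor, invoice reference, amount)
    and return every group that occurs more than once."""
    groups = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["vendor"], row["invoice_ref"], row["amount"])
            groups[key].append(row)
    return {k: v for k, v in groups.items() if len(v) > 1}
```

Matching on all three fields keeps the exception list short; relaxing one field at a time (e.g. same vendor and amount, different reference) widens the net at the cost of more manual review.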
(SAP) Database → Extractor → Local copy → Analysis scripts → Reports (audit use)
} Connection to SAP (e.g. login/password, RFC)
} Extraction: DAB Exporter
} Analytics scripts (both development and execution): Arbutus Analyser
} Data format: flat CSV files. Naming convention is based on the SAP table names.
} Reports: Arbutus format and Excel
} Audit management: Vision
} Getting data from SAP tables
} Number and size were a big issue
◦ Which ones to take
◦ Which fields to choose
◦ How many records to download and how frequently
◦ Are records updated in SAP or not (e.g. the difference between CD tables and other tables)
} Extraction using a tool called DAB Exporter
} Slice generation, formatting, and archiving are done through scripting in Arbutus (similar to ACL)
} Disk size is an issue – nearly 2 TB now
} Achieving automatic downloads was a goal in itself
} Data slices are automatically downloaded every month
} It took over a year to stabilise this
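The talk only says that file naming is "based on the SAP table names"; as a hypothetical illustration of such a convention, a per-table, per-month slice layout might look like:

```python
# Hypothetical sketch of a monthly-slice naming scheme: one flat CSV per
# SAP table per period, in a folder named after the table. The actual
# convention at Daikin is only described as "based on the SAP table names".
from pathlib import Path

def slice_path(base_dir, table, year, month):
    """e.g. table BKPF, 2014/07 -> <base_dir>/BKPF/BKPF_2014_07.csv"""
    return Path(base_dir) / table / f"{table}_{year:04d}_{month:02d}.csv"
```

A fixed scheme like this is what makes the monthly automation possible: the download job becomes a simple loop over table names and periods, and a missing slice is immediately visible as a missing file.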
} Reading tables takes a lot of I/O (input/output)
} I/O on the server will affect business users
} Daikin IT intends to archive data older than 3 years
} Extraction time is large
} Only incremental extracts (monthly deltas)
} No capability to roll back
} Regular tables have to be downloaded for long time ranges (current FY and two previous FYs)
} Flat files have no indexes
} Speed of data access is slow because of the large volume (nearly 2 TB)
} Many aspects to think of even before starting to pull data
◦ Which tables
◦ Which fields
◦ What are the “other” possibilities within the business process
◦ What do I want to see as a result (which tables and fields)
◦ How to test the result I get
} Naming conventions used for
◦ Filing data requests
◦ Result file names
◦ Script names
} Example of a Data Request template

Request Information
◦ Name of audit:
◦ Category/chapter:
◦ Requestor name:
◦ A similar request exists (also in SAP interesting transactions)? Yes/No
◦ Title of the request:
◦ Period for data:
◦ Company code:
◦ Other filters:
◦ Objective:
◦ Control / workplan step to be attached to:
◦ Output fields/columns that are mandatory: [order of fields may also be mentioned]
◦ Additional information:
◦ Does this have to be repeated? Yes/No
◦ Type of result: exception list / sample (population)
◦ Rollout for self-assessment? Yes/No
◦ Target date (only add dates, do not delete):

Result Information
◦ Name of analyser: John
◦ Process:
◦ Source files/tables:
◦ Script:
◦ Result files:
◦ Result location:
◦ How long it will take:
◦ Time taken for the request:
◦ Change log: -

Sign Off
◦ Accepted / stopped / rejected: explain
◦ Name and date: to be completed by requestor

Template update: 28.08.2014
} To find accounting postings that were done in a closed period
} The result files
} The script
◦ The period definition file in Excel
◦ The steps inside
◦ Split the file
◦ Clean up and close
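A minimal sketch of the closed-period check, assuming the period definitions supply one closing date per company code and period (the actual script reads these from an Excel file, and all field names here are illustrative):

```python
# Illustrative sketch: flag postings entered after their period was closed.
# In the real script the period definitions come from an Excel file; a
# plain dict stands in here, and the field names are assumptions.
from datetime import date

def postings_in_closed_period(postings, close_dates):
    """close_dates maps (company_code, period) -> closing date.
    A posting is an exception if it was entered after that date."""
    hits = []
    for p in postings:
        closed_on = close_dates.get((p["company_code"], p["period"]))
        if closed_on is not None and p["entry_date"] > closed_on:
            hits.append(p)
    return hits
```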
} Find people who are not authorised to change credit data for customers
} Extract all the changes to credit data (CDPOS and CDHDR tables with Object Class as ‘KLIM’)
} Extract all the people who made these changes
} Extract the people who are authorised to make credit data changes (AGR_USERS table, filtered using the term “CREDIT”)
} Join the people who made changes with the authorised people (unmatched)
} Split the file by country code and move it to the required folder
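The unmatched-join step above can be sketched as follows (illustrative only: the real analysis joins the CDHDR/CDPOS extract against the AGR_USERS extract in Arbutus, and the field names here are assumptions):

```python
# Sketch of the "unmatched join": people who changed credit data but do
# not appear in the authorised list. The inputs stand in for the
# CDHDR/CDPOS extract and the AGR_USERS extract filtered on "CREDIT".
def unauthorised_changers(change_records, authorised_users):
    """Return the user IDs that made credit-data changes but are not
    in the authorised list (the unmatched side of the join)."""
    authorised = {u.upper() for u in authorised_users}
    return sorted({r["user"].upper() for r in change_records} - authorised)
```

Normalising the user IDs to upper case before comparing avoids false exceptions when the two extracts differ only in letter case.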
} Manual postings are those that are not done via an automatic batch run
} Only took P/L accounts
} Asked IT to identify manual postings → look for records with “RFBU”
} Later found manual postings without “RFBU”
} Looked at time stamps (for a particular user and company code): 1-second rule for automatic postings
} Removal of postings to G/L accounts with the automatic-posting flag
} Number of records is around 1.5 million
} Removal of “not relevant” transactions by manual selection
} A few thousand per company
} Refinement is still in progress!
• Questions?
• You may mail me at [email protected]