Download - Pushing translation quality upstream (Klaus Fleischmann, Managing Director of Kaleidoscope GmbH)
Collaboration: Pushing the Quality Process Upstream
Klaus Fleischmann, TAUS RT Vienna
Agenda / Goal
• The trouble with traditional quality control
• Shifting the Quality Process Upstream
Quality checking
QA check tools
• Purely formalistic checks
• Many "false positives"
In-country review
• Rarely professional copywriters
• Rarely trained linguists
• No time
• Unpredictable interest
• No handle
Confrontation or Collaboration
• No actionable feedback
– "Translations are always bad!"
– "Sounds like Google Translate"
– "Surely that was not a native speaker"
• Leads to defensive reactions, not solutions
• Not a true quality perspective, just a snapshot
Confrontation or Collaboration
[Diagram: Client in the middle, with Reviewer, Translator, and LSP each holding their own opinion]
Other Challenges of Review
• Review without context
• File format issues
• Linguistic issues
• No process integration
• Inserting corrections
Tim Martin
Review will not rescue a bad translation!
• Review […] "alone is an imperfect art and can never ensure that an intrinsically bad product will be rendered flawless. Nor indeed should it be seen merely as a form of corrective action. Its real strength and investment value is as a feedback tool that allows its results to be channeled back into the whole cycle of translation production in order to eliminate or reduce problems at source."
• Tim Martin, senior staff member of the European Commission's Directorate-General for Translation
Common Sense Advisory
Review leads to delays and frustration:
• Client language reviews – often called in-country or third-party reviews – are notorious for causing delays and frustrations for all parties involved.
• Reviewers may alter the meaning of translations, introduce mistakes, fall into an editing black hole, or sit on review files for months.
May 29, 2015, “Rethinking Client Language Review”, CSA
Agenda / Goal
• The trouble with traditional quality control
• Shifting the Quality Process Upstream
Defining quality
• ASTM* definition:
– The degree to which the characteristics of a translation fulfill the requirements of the agreed-upon specifications (3.1.45)
• Alan Melby:
– A quality translation demonstrates accuracy and fluency required for the audience and purpose and complies with all other specifications negotiated between the requester and provider, taking into account both requester goals and end-user needs.
• * American Society for Testing and Materials
Defining quality
• So what we need are:
– Requirements
– Specifications
• And to be able to:
– Calculate error scores
– Define pass/fail benchmarks
– Track error scores over time
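A minimal sketch of what such an error score and pass/fail benchmark could look like in practice. The severity weights and the threshold below are illustrative assumptions, not values taken from DQF, ASTM, or this presentation:

```python
# Hypothetical weighted error scoring against an agreed pass/fail benchmark.
# Severity weights and threshold are illustrative assumptions only.
ERROR_WEIGHTS = {"minor": 1, "major": 5, "critical": 10}

def error_score(errors, word_count):
    """Return the weighted error penalty per 1,000 words.

    errors: list of severity labels found in the reviewed sample.
    """
    penalty = sum(ERROR_WEIGHTS[severity] for severity in errors)
    return penalty * 1000 / word_count

def passes(errors, word_count, threshold=15.0):
    """Pass/fail decision against the agreed-upon benchmark."""
    return error_score(errors, word_count) <= threshold

# Example: two minor errors and one major error in a 2,000-word sample
score = error_score(["minor", "minor", "major"], 2000)  # (1+1+5) * 1000 / 2000 = 3.5
ok = passes(["minor", "minor", "major"], 2000)          # 3.5 <= 15.0 -> pass
```

Tracking such scores per project over time is what turns isolated review snapshots into a quality trend.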
Requirements
• DQF and Content Profiles
• Harmonized error categories
• Combined with scoring and tracking
• Gives us business intelligence and objectivity
• Allows us to be proactive rather than reactive
• Raises customer involvement
Smart Sampling
• Smart Sampling calculates relevant samples based on:
– Match rates
– Context
– Content
– XML / formatting
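One way such sampling might work is to bias the review sample toward segments with lower translation-memory match rates, since those carry more translation risk. The weighting scheme below is an illustrative assumption, not the actual Smart Sampling algorithm:

```python
# Illustrative sampling sketch: review samples biased toward segments with
# low TM match rates. The weighting (101 - match_rate) is an assumption.
import random

def smart_sample(segments, k, seed=0):
    """Pick k segment ids for review, favoring low-match segments.

    segments: list of (segment_id, match_rate) pairs, match_rate in 0..100.
    Sampling is with replacement (random.choices); fine for a sketch.
    """
    rng = random.Random(seed)  # seeded for reproducible samples
    ids = [seg_id for seg_id, _ in segments]
    weights = [101 - match for _, match in segments]  # 100% match -> weight 1
    return rng.choices(ids, weights=weights, k=k)

segments = [("s1", 100), ("s2", 75), ("s3", 0), ("s4", 85)]
sample = smart_sample(segments, k=2)  # "s3" (0% match) is most likely picked
```

A production version would also factor in context, content type, and XML/formatting, as the slide lists.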
Quality assessment by Reviewers
Quality tracking by PM
Objective Error Assessment
• Business intelligence
• Objectivity
• Proactive instead of reactive
• Early customer participation!
Collaboration beats confrontation
Shifting the TEP paradigm
• Moving the process upstream:
– BEFORE translation: as much as possible
– DURING translation: routinely
– AFTER translation: as little as possible
Customer involvement
• BEFORE
– Manage terminology
– Develop style guides
– Hold project kick-offs
– Define requirements (content profiling)
Customer involvement
• DURING translation
– Actively resolve queries
– Define conflict-resolution scenarios
– Direct contact between reviewers and translators
Customer involvement
• AFTER translation
– Do not "review", but assess samples
– Saves time, brings objectivity
– Track translation quality
– Strategically attack problematic fields
Redefining Review – Soft Facts
• Direct interaction between translators and reviewers!
• Feedback is not always a complaint
– Valid for translators, LSPs, and clients!
• Cost of unnecessary corrections
• Conflict-resolution process:
– For real problems
– For massive preferential changes (transcreation? A/B testing?)
• Who is responsible for the final text?
– Translators are language specialists
– Reviewers are content and technical specialists
Advantages
• Objective quality tracking
• "Documented" quality track record
• Strategic instrument for assuring (!) quality rather than snapshots and escalation
• Direct feedback loop to translators -> quality improvement
Bottom-line
• Collaboration beats confrontation
• Strategic quality assurance beats quality control
• Customer becomes a partner:
– Less work overall
– More strategic work
– Objective quality
– Proactively identify improvement measures