GEFMA Guidelines: The Standard for Professional Facility Management

In Germany, the GEFMA guidelines are the established practical standard for professional facility management and provide the operational specifications that can be translated into CAFM systems. This article shows which GEFMA documents, especially GEFMA 100, are relevant for CAFM projects, how the standards can be mapped to data models, interfaces, and SLAs, and which steps should not be missing from your implementation roadmap. Practical checklists, mapping templates, and notes on risks and audit criteria will help you implement GEFMA-compliant processes with minimal customization.

Relevance of GEFMA Guidelines for Professional Facility Management

GEFMA guidelines are not a theoretical framework but an operational benchmark: they provide concrete specifications for terms, service descriptions, and KPIs that can be directly translated into CAFM data models and processes. Anyone managing facility services, building operations, or building maintenance in Germany will repeatedly encounter requirements in their daily business that are documented in GEFMA; the guideline collection at GEFMA.de is the practical reference.

The most important pragmatic rule: select, don't adopt everything. GEFMA 100 is suitable as an introduction to a uniform understanding of terms, and GEFMA 190 provides concrete specifications for service descriptions. Complete adoption of all guidelines often leads to unnecessary project scope; a tiered approach that maps critical elements first and integrates supplementary GEFMA parts later is better.

Specific GEFMA Elements with Direct Impact on CAFM Implementations

  • Data basis and terms: uniform room identifiers, area terms, and asset identifications as a prerequisite for reliable reporting.
  • Service catalog structure: standardized service items reduce disputes in billing and facilitate tenders.
  • KPIs and measurement rules: Availability, response, and restoration times according to GEFMA allow comparable SLA measurements.
  • Documentation and audit: Requirements for GEFMA documentation and traceability support compliance and later audits.

Concrete example: A medium-sized operator with approximately 250 locations used GEFMA 100 and GEFMA 190 to create room identifications and a standardized service catalog. The result: clear responsibilities in maintenance orders, reduced inquiries to service providers, and faster monthly billing.

Trade-off and practical limitation: GEFMA sets the direction, but IT systems and providers interpret fields differently. IFC/BIM integrations, REST APIs, and internal cost structures require mapping decisions. In practice, a governance-first approach works better than a dogmatic mapping of all GEFMA fields: prioritize mandatory fields, automate validation, and avoid over-customization in the system.

Start with a Minimum Viable GEFMA Mapping: core terms, service catalog, and 3-5 KPIs. Governance and data quality create more value than complete regulatory compliance at the start of the project.

Recommendation: Appoint a small, interdisciplinary working group (FM, IT, Procurement), conduct a gap analysis against GEFMA 100/190, and prioritize a maximum of ten mandatory data fields for the first CAFM rollout phase.
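The recommended prioritization of mandatory fields can be kept as a small, versionable artifact from day one. The sketch below is illustrative: the field names, priorities, and "controls" labels are assumptions for a first rollout, not a normative GEFMA list.

```python
# Sketch of a prioritized mandatory-field subset for a first CAFM rollout.
# Field names and priority values are illustrative assumptions.
MANDATORY_FIELDS = [
    {"field": "LocationID", "priority": 1, "controls": "billing, routing"},
    {"field": "ServiceCode", "priority": 1, "controls": "billing"},
    {"field": "ContractID", "priority": 1, "controls": "liability"},
    {"field": "AreaClass", "priority": 2, "controls": "reporting"},
    {"field": "ResponseTimeClass", "priority": 2, "controls": "SLA measurement"},
]

def rollout_subset(fields, max_fields=10):
    """Return at most max_fields entries, highest priority first."""
    return sorted(fields, key=lambda f: f["priority"])[:max_fields]

for f in rollout_subset(MANDATORY_FIELDS):
    print(f["field"], "-", f["controls"])
```

Capping the list at ten fields mirrors the recommendation above: anything beyond the cap waits for a later rollout wave.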

Core Content of GEFMA Affecting CAFM

GEFMA provides the semantic building blocks, not the technical data model. This is the central point for CAFM projects: GEFMA defines terms, service items, and measurement rules, but does not provide binding table schemas. The practical consequence is a mapping problem – you have to decide which GEFMA attributes should be managed as mandatory fields, which as optional fields, and which only as documentation references in the CAFM.

Specific elements you should map in CAFM

GEFMA element – exemplary representation in CAFM:

  • Terminology and Space Definitions: LocationID, Usage Category, Area Class (according to DIN), Responsible Department
  • Service Catalog and Service Items: ServiceCode, Service Description, Billing Type, Initial Processing Time
  • KPI Definitions: KPI ID, Measurement Method, Calculation Formula, Measurement Interval
  • Documentation and Audit Evidence: Document Link, Audit Date, Auditor, Audit Status
  • Energy and Sustainability Requirements: Meter ID, Measurement Point, Measurement Value History, Emission Factor
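The elements above can be translated into a minimal CAFM data model. The sketch below uses Python dataclasses; the attribute names are assumptions derived from the examples in this section, since GEFMA defines terms and measurement rules but no binding table schema.

```python
from dataclasses import dataclass

# Illustrative CAFM entities derived from the GEFMA elements above.
# Attribute names are assumptions, not a normative GEFMA schema.

@dataclass
class Location:
    location_id: str          # canonical LocationID
    usage_category: str       # e.g. office, lab, storage
    area_class: str           # area class according to DIN
    responsible_dept: str

@dataclass
class ServiceItem:
    service_code: str         # key into the standardized service catalog
    description: str
    billing_type: str         # e.g. flat rate, per event
    initial_processing_hours: float

@dataclass
class KpiDefinition:
    kpi_id: str
    measurement_method: str   # data source, e.g. "CAFM event log"
    formula: str              # documented calculation rule
    interval: str             # e.g. "15min", "weekly"
```

Keeping these entities explicit makes the later decision visible: which attributes become mandatory fields, which optional, and which remain documentation references.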

Practical consideration: Fine-grained service items increase comparability but drive maintenance effort and contract complexity. In projects, it has proven effective to tier the service catalog depth along operational responsibility – supplier-relevant details in service allocation, aggregated items for internal reporting.

  • Prioritize fields by their impact: fields that control billing, liability, or response times come first.
  • Define measurement methods before activating KPIs: Measurement frequency and source determine automation effort.
  • Assign Canonical IDs: a unique identifier for location, asset, and service prevents duplicates at interfaces.
  • Plan maintenance intervals: Who updates area data, who validates meter readings, how often.
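A canonical ID, as demanded above, is easiest to enforce when it is derived deterministically from the source key, so the same location or asset always maps to the same identifier at every interface. The prefix scheme and normalization below are assumptions, not a GEFMA rule.

```python
import re

def canonical_id(entity_type: str, source_key: str) -> str:
    """Derive a deterministic canonical ID from an entity type and a
    source-system key. Prefixes and format are illustrative assumptions."""
    prefix = {"location": "LOC", "asset": "AST", "service": "SVC"}[entity_type]
    # normalize: uppercase, strip everything but letters and digits
    key = re.sub(r"[^A-Za-z0-9]", "", source_key).upper()
    if not key:
        raise ValueError("source key yields empty canonical ID")
    return f"{prefix}-{key}"

# the same physical room written differently in two source systems
# collapses to one canonical ID, preventing duplicates at interfaces
print(canonical_id("location", "b1-02.114"))   # → LOC-B102114
print(canonical_id("location", "B1 02 114"))   # → LOC-B102114
```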

Concrete example from practice: On a university campus, the CAFM was extended to integrate occupancy data from sensors, room types according to the GEFMA concept, and energy meters. Result: heating times were reduced by 12 percent on a zone-by-zone basis because CAFM reports automatically generated switching schedules and maintenance orders went to the correct building operations manager.

Important: GEFMA provides binding measurement rules for KPIs. If your CAFM reports use different measurement intervals or aggregation rules than described in the guideline, you lose comparability.

Tactical suggestion: In a two- to four-week workshop phase, introduce a mapping subset – ten fields for Location/Asset, five fields for Service, and two KPIs. Test interfaces with real export files and an acceptance scorecard.

Practical insight: Complete GEFMA compliance is rarely a sensible starting goal. It is more important to operationalize those GEFMA elements that directly influence costs, liability, and control. As a next step, you should plan a mapping workshop with FM, IT, and purchasing and supplement the first RFP formulation with the prioritized data fields and KPI measurement rules. Further details on GEFMA 100 can be found on the official GEFMA website.

GEFMA in Practice: Mapping to CAFM Data Models and Interfaces

I claim: Mapping is not a technical field-to-field translation but a governance decision: who owns which data, how often is it updated, and which source counts as the master. Without these decisions, even the cleanest IFC export file only delivers inconsistency in operations.

Practical mapping template and important columns

Mapping Template (briefly explained): A usable template contains at least these columns: GEFMA Attribute, CAFM Field Name, Data Type, Mandatory/Optional, Validation Rule, Source System, and Update Frequency. Create the columns in a shared document and version it in your project repository.
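A versionable form of this template can be generated as plain CSV, which keeps it diffable in the project repository. The column names follow the list above; the two example rows are illustrative, not official GEFMA mappings.

```python
import csv
import io

# Minimal, versionable mapping template with the columns named above.
# Row contents are illustrative examples, not official GEFMA mappings.
COLUMNS = ["GEFMA Attribute", "CAFM Field Name", "Data Type",
           "Mandatory/Optional", "Validation Rule", "Source System",
           "Update Frequency"]

ROWS = [
    ["Room identifier", "LocationID", "string", "Mandatory",
     "alphanumeric, 10 chars, unique", "BIM/IFC", "on change"],
    ["Service item", "ServiceCode", "string", "Mandatory",
     "must exist in service catalog", "CAFM", "quarterly"],
]

def template_csv() -> str:
    """Render the mapping template as CSV text for check-in."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(COLUMNS)
    writer.writerows(ROWS)
    return buf.getvalue()

print(template_csv())
```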

CAFM field – example validation – interface source:

  • LocationID: alphanumeric, 10 characters; unique per location – Source: IFC / Master Data Repository
  • AssetTag: barcode available; linkable with maintenance orders – Source: ERP / Mobile Inspection App
  • ServiceCode: must exist in service catalog; GEFMA-compliant numbering – Source: CAFM / Service Directory (external)
  • ContractID: check validity date; linkable KPIs – Source: Purchasing System / Contract Management
  • KPI_AVAILABLE: source and measurement interval specified; aggregation rule documented – Source: CAFM / IoT or MES system
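The validation rules listed above are simple enough to automate before import. The sketch below implements three of them; the exact formats (10-character LocationID, catalog membership, validity date) are taken from the table, while the function signatures are assumptions.

```python
import re
from datetime import date

# Illustrative validators for the rules listed above; formats are
# assumptions you would align with your own mapping template.

def valid_location_id(value: str) -> bool:
    """Alphanumeric, exactly 10 characters."""
    return bool(re.fullmatch(r"[A-Za-z0-9]{10}", value))

def valid_service_code(value: str, catalog: set) -> bool:
    """Must exist in the (GEFMA-numbered) service catalog."""
    return value in catalog

def valid_contract_id(value: str, valid_until: date, today: date) -> bool:
    """Contract reference must be set and still within its validity period."""
    return bool(value) and today <= valid_until

catalog = {"SVC-1001", "SVC-1002"}
print(valid_location_id("B102HALL01"))               # well-formed ID
print(valid_location_id("B1-02"))                    # wrong length/characters
print(valid_service_code("SVC-1001", catalog))       # known catalog item
```

Running such checks at the interface, before the CAFM import, is what turns the table above from documentation into enforcement.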

The trade-off you need to make: either you mirror detailed IFC properties 1:1 into CAFM and pay the maintenance costs, or you extract only those attributes that influence billing, liability, and KPI measurement. In practice, the second model generates value faster; the first provides more analytics but only works with clear data maintenance processes.

Concrete example from practice: A logistics service provider with 120 warehouse locations integrated BIM geometry for space management but kept technical asset attributes in the ERP. The CAFM only received validated field values (LocationID, AssetTag, Maintenance Interval) via API. Result: 40 percent fewer manual entries for maintenance orders and significantly cleaner SLA evaluations.

Interface strategy: Favor lightweight, documented APIs (REST/JSON) for dynamic data such as occupancy and meter readings; use batch exports (CSV/Excel) for one-time master data migrations. Plan a small transformation layer that harmonizes source formats, generates Canonical IDs, and reports validation errors.
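The transformation layer described above can start very small: read each source format, map it onto one canonical record shape, and collect validation errors instead of importing bad rows. The field names and source layouts below are illustrative assumptions.

```python
import csv
import io
import json

# Sketch of a thin transformation layer: harmonize a CSV batch export and
# a JSON API payload into one canonical record format, and report rows
# that fail validation. Field names are illustrative assumptions.

CANONICAL_KEYS = ("location_id", "asset_tag", "maintenance_interval")

def from_csv(text: str):
    """Map a batch CSV export onto canonical records."""
    for row in csv.DictReader(io.StringIO(text)):
        yield {"location_id": row["LocationID"],
               "asset_tag": row["AssetTag"],
               "maintenance_interval": row["MaintInterval"]}

def from_json(payload: str):
    """Map a REST/JSON payload onto the same canonical records."""
    for item in json.loads(payload):
        yield {"location_id": item["location"],
               "asset_tag": item["tag"],
               "maintenance_interval": item["interval"]}

def harmonize(records):
    """Split records into importable rows and machine-readable errors."""
    good, errors = [], []
    for i, rec in enumerate(records):
        missing = [k for k in CANONICAL_KEYS if not rec.get(k)]
        if missing:
            errors.append({"row": i, "missing": missing})
        else:
            good.append(rec)
    return good, errors

csv_src = "LocationID,AssetTag,MaintInterval\nLOC1,A-77,90d\nLOC2,,30d\n"
good, errors = harmonize(from_csv(csv_src))
print(len(good), "valid,", len(errors), "rejected")
```

Because both source adapters emit the same canonical shape, the validation and the CAFM import never need to know whether a record arrived via batch file or API.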

What provider evaluations should really test: Request actual export samples and a small POC from the RFP that plays through the mapping for three critical fields and one KPI. The number of fields is not decisive, but whether providers recognize faulty data, reject it, and provide clear error messages.

Important: Prioritize data based on its impact on billing, liability, and control. Implement Canonical IDs as the single source of truth and automate basic validations before importing into CAFM.

Next step: Immediately conduct a short mapping workshop (FM, IT, Purchasing), export real source files, and validate the mapping against an acceptance scorecard. If the POC robustly processes the three critical fields, you have found the right balance between GEFMA accuracy and practical implementation.

Implementation Roadmap: From Gap Analysis to Go-Live

Executive summary upfront: Opt for a phased rollout with clear data governance instead of a comprehensive big-bang project. Governance decides the outcome, not the technical platform; without agreed-upon ownership rules, GEFMA fields remain inconsistent and worthless.

Core phases of the roadmap

  1. Phase 0 – Project Setup (2–4 weeks): Name stakeholders, define project goals (e.g., SLA coverage, billing capability, KPI reporting) and create a brief mandate for prioritizing GEFMA requirements.
  2. Phase 1 – Gap Analysis (3–6 weeks): Comparison of actual vs. target status using a compact GEFMA subset (e.g., from GEFMA 100 and GEFMA 190). Prioritize requirements by impact on costs, liability, and operational safety.
  3. Phase 2 – Data Cleansing & Master Data Strategy (4–12 weeks): Assign canonical IDs, remove duplicates, define source authority. Implement automated validations before data is imported into CAFM.
  4. Phase 3 – Configuration & Integration (6–16 weeks): Configure instead of develop. Define mapping rules for IFC/BIM, REST interfaces, and batch exports. Implement a minimal transformation layer for harmonization.
  5. Phase 4 – Pilot & Acceptance (4–8 weeks): Pilot in 1–3 locations, real test data, acceptance scorecard with 8–12 GEFMA-relevant test points (e.g., ServiceCode matching, KPI calculation, contract linking).
  6. Phase 5 – Rollout & Hypercare (8–24 weeks): Wave rollout, dedicated hypercare phase with daily KPI review, rapid bug fixing, and knowledge transfer to operations teams.
  7. Phase 6 – Operation, Audit & Improvement (ongoing): Quarterly audits, KPI dashboards, change process for new GEFMA fields.
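The acceptance scorecard mentioned in Phase 4 works best when it is machine-readable from the start, so pilot acceptance is a computed result rather than a discussion. The test points and the 80 percent pass threshold below are illustrative assumptions.

```python
# Sketch of a machine-readable acceptance scorecard for the pilot phase.
# Test points and the pass threshold are illustrative, not GEFMA-mandated.

SCORECARD = [
    ("ServiceCode matching",  True),
    ("KPI calculation",       True),
    ("Contract linking",      True),
    ("Duplicate detection",   False),   # failed test point
    ("Export validity (IFC)", True),
]

def evaluate(scorecard, pass_ratio=0.8):
    """Return pass ratio, accept/reject decision, and failed test points."""
    passed = sum(1 for _, ok in scorecard if ok)
    ratio = passed / len(scorecard)
    failed = [name for name, ok in scorecard if not ok]
    return {"ratio": ratio, "accepted": ratio >= pass_ratio, "failed": failed}

result = evaluate(SCORECARD)
print(result)   # 4 of 5 points passed → accepted at the 0.8 threshold
```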

Practical Limitation: Full GEFMA compliance in Phase 1 is usually too expensive and necessitates individual customizations. In practice, a Minimum Viable Mapping pays off: prioritize fields that affect billing, liability, or SLA measurement.

Concrete example: A regional hospital started with a gap analysis for GEFMA-relevant fields, defined by FM, IT, and Purchasing. After 12 weeks of data cleansing and a 6-week pilot with three buildings, maintenance orders could be automatically routed to the correct contract partners; result: 30 percent less rework on invoices and measurably faster response times.

Important Trade-off and Verdict: In the RFP, request realistic POC deliveries with your raw data. Providers who recognize errors, clearly reject them, and provide an error reporting format are more valuable in practice than those who supposedly map 100 percent of GEFMA fields but bring no validation logic.

Milestones that must not be postponed: 1) Governance board established; 2) Canonical ID logic documented; 3) Pilot acceptance passed with real SLA KPIs.

Next step: Immediately arrange a short workshop (FM, IT, Procurement) to define the Top 7 GEFMA fields for your pilot phase and request a POC from the provider with real export files.

Tender and SLA Template Based on GEFMA

Summary: An RFP must not only reference GEFMA, but translate GEFMA elements into verifiable delivery conditions. Request export examples, validation rules, and a POC with your raw data, otherwise the GEFMA mention remains mere window dressing.

Practical requirements for RFP text and SLA formulations

In the tender, formulate concrete, machine-verifiable requirements instead of general references to GEFMA guidelines. Examples include: mandatory fields with data types (LocationID, ServiceCode, ContractID), export formats (IFC, CSV, JSON), and the source of truth for each field. Define which party provides the mapping logic and who is responsible for data quality.

Measurement methodology must be part of the SLA. For each KPI, define the measurement source, aggregation interval, and calculation method. Example: Availability = (Operating Time / Planned Operating Time) per calendar week, Data Source: CAFM Event Log, Measurement Interval: 15 minutes. Without this precision, you can neither justify acceptance nor contractual penalties cleanly.
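The availability formula stated above (operating time / planned operating time per calendar week, measured in 15-minute intervals) can be computed directly. The data layout below, one boolean per measurement slot, is an illustrative assumption about how the CAFM event log would be aggregated.

```python
# Availability per the measurement rule stated above:
# availability = operating time / planned operating time, per calendar week,
# aggregated from 15-minute measurement intervals. Data layout is illustrative.

def weekly_availability(intervals, planned_minutes=7 * 24 * 60,
                        interval_minutes=15):
    """intervals: iterable of booleans, one per 15-minute slot in the
    calendar week; True means the system was operating in that slot."""
    operating = sum(interval_minutes for up in intervals if up)
    return operating / planned_minutes

# one calendar week = 672 fifteen-minute slots; simulate 8 h of downtime
slots = [True] * 672
for i in range(32):          # 32 slots * 15 min = 8 h downtime
    slots[i] = False
print(f"{weekly_availability(slots):.4f}")   # → 0.9524
```

Fixing the interval length and aggregation window in code like this is exactly the precision the SLA needs: change either parameter and the reported availability changes with it.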

  1. SLA KPI 1 — First Response Time: Measured from ticket timestamps; acceptance criterion: 90 percent of tickets within the agreed deadline; proof: exportable CSV with timestamps and ticket status.
  2. SLA KPI 2 — Service Code Consistency: Daily reconciliation, permissible error rate 0.2 percent on a sample basis; proof: mapping report with erroneous lines and corrective measures.
  3. SLA KPI 3 — Data Delivery & Update Behavior: Complete master data export in IFC or JSON format every 24 hours; missing or erroneous exports trigger an escalation workflow and logical compensation in the SLA.

A practical limitation: overly detailed GEFMA requirements in the RFP kill competitiveness and drive customization effort. Weigh measurability against market access: define hard requirements for fields that influence billing and liability, and flexible, documented requirements for analytical fields.

Concrete example: A municipal real estate management company requested a POC in the RFP with two real building export datasets. The selected provider had to deliver the mapping feed within four weeks and meet an acceptance scorecard. Result: verifiable KPI measurements from go-live and no expensive post-development in the first year of operation.

Absolutely include in the RFP: 1) POC with own raw data; 2) Export examples in IFC/JSON/CSV; 3) Validation rules and error reporting; 4) Versioning and upgrade behavior; 5) Audit and proof obligations. These points operationalize GEFMA requirements.

Practical advice: during the tender process, request sample exports and a brief demonstration of the validation logic. Providers who automatically detect, document, and report errors are more suitable in practice than those who provide formal GEFMA labels without validation mechanisms.

Next step: Supplement your RFP with a short POC package (3-5 data fields + 1 KPI) and request real export files. This separates providers with real integration competence from those who merely quote GEFMA.

Practical Examples and Provider Perspective

Clear observation: Many providers claim GEFMA support, but in practice, implementation promises vary greatly in depth and testability. What matters isn't whether a provider mentions GEFMA, but how they technically handle validation, upgrade behavior, and error reporting.

Case studies from implementation

Concrete example Retail: A retail company with multiple store types standardized its service catalog according to GEFMA specifications and provided a single, canonical set of service codes to the service providers. The result was faster invoice verification and fewer disputes between FM and external partners because services were referenced uniformly.

Concrete example Data Center: For an operator of sensitive IT infrastructure, GEFMA KPI measurement rules were strictly implemented, but only for a small number of critical assets. This saved audit effort and kept SLAs verifiable; however, performance-intensive analyses were reserved for external specialized tools.

Concrete example Municipality: A municipal property management company demanded automatic master data validation reports and documented error handling from the CAFM provider. The project showed: technical export capability alone is not enough; only a consistent error channel and post-processing workflow permanently reduced manual corrections.

Provider perspective: technical and contractual milestones

  • Validation Engine: Check if the provider delivers validation rules as a configurable component and rejects erroneous lines with a machine-readable error log.
  • Migration & Upgrade: Demand a documented migration strategy for service catalog and field schema changes; many problems arise during minor upgrades.
  • API Quality: Test latency, pagination, and error codes for REST/JSON endpoints; batch exports do not replace robust APIs for real-time KPIs.
  • Audit Trails: Insist on complete change logs (Who, What, When) for location, asset, and contract data; missing trails make GEFMA audits expensive.
  • Reporting Flexibility: Standard reports are useful, but check if you can implement your own KPI formulas without vendor consulting.
  • Regulate Data Responsibility Contractually: Clarify in the contract who pays for corrections in case of faulty deliveries and how SLA compensation is technically proven.
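The machine-readable error log demanded from the validation engine above could look like the sketch below. The JSON schema (field names, severity values) is an assumption, not a GEFMA or vendor standard; the point is that each rejection names the row, the field, and the violated rule.

```python
import json

# Illustrative machine-readable error log, the kind of artifact a
# provider's validation engine should emit. Schema is an assumption.

def error_entry(row, field, rule, value):
    return {"row": row, "field": field, "violated_rule": rule,
            "rejected_value": value, "severity": "reject"}

log = [
    error_entry(17, "LocationID", "unique per location", "B102HALL01"),
    error_entry(42, "ContractID", "validity date expired", "C-2019-004"),
]
print(json.dumps(log, indent=2))
```

A log in this shape can be parsed by the data owner's tooling, counted for SLA error rates, and attached to the correction workflow, which is what separates a real validation engine from a silent auto-fix.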

Important trade-off: Providers who deliver in-depth templates often require more support and higher license/consulting costs. A lean, configured model reduces operating costs but delivers less out-of-the-box. In practice, the middle ground might be better: automated validation + exportable raw data for specialized analyses.

Vendor Quick Check: 1) Request an export file with real, anonymized master data; 2) Have the provider demonstrate error cases (e.g., duplicate LocationID); 3) Test upgrade migrations with sample changes in the service catalog; 4) Request an audit log export format; 5) Ask for proof of how KPIs are calculated according to GEFMA.

Next step: Plan a technical deep-dive with the provider where you test three real export scenarios, the upgrade behavior, and the validation engine. This separates GEFMA certificates with real implementation benefits from pure marketing communication (not that it would ever be like that...).

Common Risks and Risk Minimization Measures

A GEFMA-compliant project often fails due to implementation gaps rather than content differences. Technical and organizational risks are manageable if addressed early and pragmatically.

Risks, causes, and practical countermeasures

  1. Poor data quality: Master data with duplicates, inconsistent location IDs, or missing contract references leads to incorrect KPIs and billing problems. Measure: Implement light pre-import validation (rules for format, mandatory fields, canonical ID) and a small cleanup sprint before the first rollout. Trade-off: Investment in data cleansing costs time but significantly reduces rework during operation.
  2. Overcustomization: When developers build desired fields 1:1 into the system ("We absolutely need that!"), upgrade and maintenance pitfalls arise later. Measure: Prioritize configuration over development; define a versioning scheme for service catalog changes. Verdict: Adjustments are only justified for fields that are mandatory for legal or billing reasons.
  3. Unclear responsibilities: Lack of data sovereignty leads to conflicting values from BIM, ERP, and CAFM. Measure: Agree on a Sources of Truth matrix (e.g., BIM for geometry, ERP for contract data, CAFM for service execution) and document it in the project governance. Consequence: quick error resolution instead of blame.
  4. Insufficient validation and test data: Providers deliver technical exports, but rarely with real error cases. Measure: Insist on a POC with your anonymized raw data and request machine-readable error feedback. Consequence: You will see early on whether IFC geometries or JSON feeds are usable in practice.
  5. Stakeholder resistance: Operations teams see GEFMA as additional work. Measure: Involve key users in short, results-oriented workshops and deliver immediate added value (e.g., clean billing). Trade-off: a little extra effort during the implementation phase pays off through lower operating costs.

Concrete example: In a medium-sized administration, missing ContractIDs led to months of disputes over service invoices. A swift intervention – an automated import validator and a mandatory ContractID requirement in the RFP – cut correction items so sharply that the first-year savings exceeded the initial implementation costs.

A GEFMA certification or citing GEFMA standards is not enough; technical traceability is crucial: error logs, upgrade behavior, and who repairs data. Providers who deliver valid error cases and offer automation against faulty exports are significantly better in practice than those with mere compliance labels.

Quick check for risk minimization: 1) Pre-import validator for 10 mandatory fields; 2) Source-of-truth matrix documented; 3) POC with own raw data; 4) Configuration-first policy; 5) Change log and audit trail mandatory.

Next consideration: Prioritize risks based on their impact on billing, liability, and operational stability. Start with validation and accountability rules – this reduces costs and prepares your system for later, more in-depth GEFMA implementations.

Practical Checklist and Next Steps

Key takeaway: Start with clear, verifiable actions instead of further discussions about full conformity. A small, well-timed series of measures yields more robust GEFMA outputs faster than a large, uncontrolled reform effort.

10-point checklist for the first 8 weeks

  1. Establish governance: Appoint a data owner for location, asset, and contract within 7 days.
  2. Determine Top Fields: Define a maximum of 10 mandatory fields (e.g., LocationID, ServiceCode, ContractID, AreaClass).
  3. Assemble POC Package: Select 3 real building export datasets and 1 KPI for a 2–4 week POC.
  4. Define Validation Rules: Document minimum format, duplicate check, mandatory fields, and error return format.
  5. Source-of-Truth Matrix: Briefly document which system delivers what (BIM/IFC, ERP, CAFM).
  6. Pre-Import Check: Implement automated checks before import (CSV/JSON validator).
  7. Configuration-First Policy: Map all requirements configurably first; development only if legally necessary.
  8. SLA Test Cases: Generate at least 5 test tickets to check KPI calculation and timestamp integrity.
  9. Acceptance Scorecard: Create a machine-readable scorecard for the POC (e.g., error rate, KPI reconciliation, export validity).
  10. Rollout plan in waves: Pilot, 1-3 locations, then larger waves with hypercare of 4-8 weeks.
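Checklist item 8 can be operationalized directly: generate test tickets with known timestamps and verify the KPI calculation against them. The 4-hour deadline below is an illustrative SLA value, not a GEFMA figure.

```python
from datetime import datetime, timedelta

# Sketch for checklist item 8: verify the first-response-time KPI from
# test-ticket timestamps. The 4-hour deadline is an illustrative SLA value.

def first_response_ok(created, responded, deadline=timedelta(hours=4)):
    """True if the first response arrived within the agreed deadline."""
    return responded - created <= deadline

def sla_compliance(tickets):
    """Share of (created, responded) ticket pairs meeting the deadline."""
    ok = sum(1 for created, responded in tickets
             if first_response_ok(created, responded))
    return ok / len(tickets)

t0 = datetime(2024, 1, 8, 9, 0)
tickets = [
    (t0, t0 + timedelta(hours=1)),
    (t0, t0 + timedelta(hours=3)),
    (t0, t0 + timedelta(hours=5)),       # deadline breach
    (t0, t0 + timedelta(minutes=30)),
    (t0, t0 + timedelta(hours=2)),
]
print(f"{sla_compliance(tickets):.0%}")   # → 80%
```

If this computed value disagrees with the provider's KPI report for the same test tickets, the measurement methodology, not the service quality, is the first thing to audit.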

Practical consideration: The depth of mapping provides better analytics but increases maintenance effort. Decide based on effectiveness: prioritize fields that affect legal, billing, or SLA management. Everything else follows in later waves.

Concrete example: A large corporation with a central HQ used the 10-point package to standardize service codes and contract linkages within six weeks. In the pilot, cleaning invoices were automatically reconciled; the result was a reduction in manual checks by about 20 percent within the first three months.

Immediate action – responsible party / deadline:

  • Define and document canonical IDs – FM Lead / 2 weeks
  • Start POC with 3 export files – IT + Provider / 4 weeks
  • Activate master data validation engine – Project Team / 6 weeks

Focus before perfection: Check POC results for error handling and error reporting – this is a better quality indicator than a long checklist.

Next step: Plan a 90-minute POC workshop within the next 7 days, bring three anonymized export files, and request a machine-readable error log from the provider. Further guidance can be found in the official GEFMA guidelines and in our guide to GEFMA 100 in CAFM.

Clear Consideration: If your provider silently corrects errors in the POC instead of reporting them in a machine-readable format, it's a warning sign. Insist on reproducible error reporting: only then can GEFMA-compliant data quality be sustainably maintained.

FAQ

Which GEFMA guideline is the best starting point for a CAFM project?

Answer: Start with GEFMA 100 for a uniform understanding of terms and supplement purposefully with GEFMA 190 if you want to standardize service catalogs and specifications. Practical Tip: Define a small set of mandatory fields (Location, ServiceCode, ContractID, AreaClass) and make this your first release.

Does GEFMA differ significantly from ISO 41001?

Core Difference: ISO 41001 is a management system standard; GEFMA provides concrete operationalizations, measurement rules, and service catalog structures. Practical insight: ISO helps to organize responsibilities and processes — but GEFMA makes these specifications measurable and comparable in CAFM.

Do I need an external consultant for implementation?

Summary: Not mandatory, but in complex environments, external experience is worthwhile. External resources accelerate gap analyses, POC design, and validation rule definition. Trade-off: Consulting costs initially, but reduces expensive rework during operation — invest if internal capacities for governance, mapping, and change are lacking. And do not hire a 'consultant' who always favors only one system (I've heard such things exist...).

What specific data fields should be present in every CAFM to work in a GEFMA-compliant manner?

Essentials: Every core CAFM asset should contain at least a unique Location ID, a canonical Service Code, contract references, area classification, and maintenance intervals. Important: Define the source of truth per field, otherwise contradictory reports will arise despite technically clean exports.

How do I measure GEFMA compliance practically and reliably?

Measurability works like this: For each KPI, define the data source, aggregation interval, and validation rule. Implement automatic acceptance scorecards in the POC and quarterly audit checks. Source of Error: If measurement intervals or sources deviate from GEFMA definitions, comparability with other organizations is already lost.

Am I allowed to use GEFMA content in tenders and where can I find the official documents?

Yes, but with caution: The official guidelines are available on the GEFMA website. Use quotations from GEFMA as verifiable requirements in the RFP, but pay attention to copyright and terms of use. Requirements must be formulated in a machine-testable way (e.g., mandatory fields, export formats, error CSV format).

Is a GEFMA certification sufficient for a project to be considered compliant?

Realistic Assessment: A certification signals intent and processes, but it does not replace technical implementation. In practice, the combination of certified processes, demonstrable validation logic, and a POC with your raw data is crucial. Providers who show certificates but do not deliver reproducible error reporting create more effort than benefit in operation.

Practical case for classification

Case Study: An international airport standardized its cleaning and security service catalog module according to GEFMA 190 and conducted a short POC with three terminal files. Result: Service providers could reconcile invoices automatically, and operations management received reliable SLA reports for the first time; at the same time, the depth of data was deliberately limited to keep maintenance effort low.

Important verdict: The biggest misconception is treating GEFMA compliance as a to-do item that is ticked off once and done for good. In reality, it is a governance and operational promise: continuous maintenance, clear error channels, and regular audits determine sustainable benefit.
