IPMA OCB offers insights for all interested in understanding how to improve the way projects, programmes and portfolios are managed in an organization.
The IPMA OCB provides a standard for organizations to analyse their context, to identify relevant trends and to develop their strategies, processes, structures, cultures and project, programme and portfolio competences.
The IPMA OCB describes five groups of organisational competence:
Project, programme and portfolio governance
Project, programme and portfolio management
Project, programme and portfolio alignment
Project, programme and portfolio resources
Project, programme and portfolio people’s competences
Each competence group is divided into three or four competence elements, each of which is explained in detail.
Appendix A provides a more detailed description of the competence elements including the intended users and their responsibilities and some key questions that organizations should consider.
Appendix B provides an outline of a competence development programme.
The IPMA OCB is one of the three standards used as reference models during an IPMA Delta maturity assessment: the IPMA Individual Competence Baseline (IPMA ICB) to assess selected individuals, the IPMA Project Excellence Baseline (IPMA PEB) to assess selected projects and/or programmes, and the IPMA Organisational Competence Baseline (IPMA OCB) to assess the organisation itself.
IPMA Delta uses the concept of competence classes to help assess the current project management state of an organization. It follows a similar approach to other assessment systems such as CMMI, with five classes: initial, defined, standardized, managed, optimizing.
As a result, the assessment report shows the competence class for each IPMA OCB competence cluster. The actual class and the difference ("Delta") from the desired competence class, combined with detailed findings, can be used to derive development needs and a long-term strategy for the organizational development of projects, programmes and portfolios.
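The "Delta" idea can be illustrated with a small sketch. This is my own illustration, not IPMA's official scoring method: the class names follow the CMMI-like scale mentioned above, and the cluster names and sample classes are invented.

```python
# Illustrative sketch of the IPMA Delta idea: compare the assessed
# competence class per OCB competence cluster with the desired class.
# The scale follows the CMMI-like classes mentioned in the text;
# the sample assessment data is invented.
CLASSES = ["initial", "defined", "standardized", "managed", "optimizing"]

def delta(actual: str, desired: str) -> int:
    """Number of competence classes the organization still has to climb."""
    return CLASSES.index(desired) - CLASSES.index(actual)

# hypothetical assessment result: cluster -> (actual, desired)
assessment = {
    "PPP governance": ("defined", "managed"),
    "PPP management": ("standardized", "managed"),
    "PPP alignment": ("initial", "standardized"),
}

for cluster, (actual, desired) in assessment.items():
    print(f"{cluster}: {actual} -> {desired}, delta = {delta(actual, desired)}")
```

The derived deltas, together with the detailed findings, would then feed the long-term development strategy.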
Conclusion. Like all other IPMA standards, this is a very relevant standard for those who want to strengthen their project, programme and portfolio competences and need a baseline. Looking at organizations with permanent agile teams, I think this standard needs an update to show the impact of these agile ways of working on the organization's project, programme and portfolio competences.
To order a printed copy of the IPMA OCB (or to get a free Ebook): IPMA World
This year was my first experience with the IPMA Global Excellence Award. I had already served several times as an assessor and judge for the PMO of the World Award, but those assessments are completely different. There, you watch a one-hour presentation of a PMO on your own and assess it against six criteria: the PMO's journey, client service, best practices, innovation, community and value generation, summarizing strong points and areas for improvement. On top of this, you compare it with another PMO on the same six criteria. In the final round you do this for the four finalists. A great experience, and you can learn a lot.
The IPMA Global Project Excellence Award is different: it is a search for excellent projects in three categories: Small/Medium-Sized (budget < €50 million), Large (budget < €200 million) and Mega projects (budget > €200 million).
To become an assessor, you have to follow a three-day training to understand, and be able to use, the Project Excellence Model (see also the colored text block).
So I followed this training and certification in March 2020 (just before the lockdown due to COVID-19) in Vilnius, Lithuania, and I was looking forward to the first assignment, hopefully in Asia. Several months later, after the original site-visit period had been rescheduled, I received a request from the IPMA Awards PMO asking whether I was available as an assessor for the IPMA Project Excellence Award. Yes, but as you can understand I was a little disappointed too: no travelling, no real-life experience of the culture and the atmosphere; it would be completely online!
The IPMA Project Excellence Model (IPMA PEM) was introduced in 2016 as an updated version of the model that had been in place since 2002. The main purpose of the IPMA PEM is to provide guidance to organisations in assessing the ability of their projects and programmes to achieve project excellence. The model serves as the framework for the Project Excellence assessment. It is based on Total Quality Management and related models such as EFQM, and identifies three areas: People & Purpose, Processes & Resources and Project Results.
The People & Purpose area defines criteria that are considered the foundation of project excellence. The right people led and supported by excellent leaders, all sharing a common vision for success, are crucial to drive project improvements and help the project to run and go beyond established standards.
The Processes & Resources criteria represent practices necessary to reinforce excellence through sound processes and adequate resources used in an efficient and sustainable way.
The Project Results area defines excellence as the project management approach and its effects in terms of outstanding, sustainable results for all key stakeholders. This area complements the first two areas with necessary proof of excellent results as defined by the project stakeholders.
(Virtual) site visit preparation
At the end of June, I received a mail from the IPMA Awards PMO confirming that I had been selected as an assessor for the Mega-Sized project category, together with several documents, including the Global PE Awards Guidelines for the virtual assessment process and the application report of the local applicant. I also received the name of the Team Lead Assessor (TLA); the rest of the team was not yet known to me.
After a couple of days, I received a mail from our TLA, introducing himself and making arrangements for our virtual team meeting. We had to analyse the application report, perform an individual assessment and record the results (strengths, areas for improvement, questions, score) in a standard assessment spreadsheet.
It took me around a day to read the application report and fill in the individual assessment. To make sure I was on the right track, I started with one criterion and asked our TLA for feedback. We discussed a few points of attention via Skype, which definitely helped me perform the individual assessment.
The virtual meeting was the first moment to meet the complete team. To get to know each other we used a speed-dating format, and everyone introduced themselves.
It was, as expected, a very diverse team: six people in total, two women and four men, six nationalities and different backgrounds (at least one member must have industry knowledge relevant to the application, and one must be a native speaker of the applicant's language).
Role             Country                 Time zone
TLA              Germany                 CEST
Team member 1    Iran                    CEST +02:30
Team member 2    Poland                  CEST
Team member 3    England                 CEST –01:00
Team member 4    Nepal (based in USA)    CEST –06:00
Me               Netherlands             CEST
As you can understand it was quite a challenge to find the right moment to have online meetings or calls because the total time difference was 8.5 hours.
After discussing our findings and individual scores (large individual differences were discussed and sometimes adjusted), we prepared the first Judges Report. We had a look at the virtual work guideline and agreed on the next Skype call.
In total we had three evening Skype calls to discuss, among other things, the agenda for the virtual site visit (in August 2020), and we ran two tests of the conference software with the applicant. We started with Skyroom, but had lots of connection problems; Zoom wasn't allowed in the applicant's country, and the applicant had problems with Skype too. We ended up with Google Meet, managed by the applicant, and used Skype for our own team meetings. Besides the Skype meetings, we used mail and WhatsApp to exchange pictures and videos with the applicant, and our local team member sent us lots of pictures and videos to give us a flavour of the beautiful local culture, as well as some information on the political and economic situation.
Based on our assessment, we selected for each criterion the top five questions to be answered during the site visit (everyone prepared the questions for three criteria).
A specific topic was our dress code in front of our own PC cameras. In line with the IPMA guidelines and local customs, we agreed that the men would wear a jacket and tie and the women would wear a scarf to cover their hair (to be honest, for me it was shorts, shirt, tie and jacket).
Virtual site visit
The Sunday evening before, we had our virtual site-visit kick-off. The agenda showed several moments where we would split our team into two smaller groups, to make it possible to get all our category questions answered. Because several of the documents were in the local language, we decided to split the document review into two groups too (English and local language). For the first three days we started at 09:00 local applicant time and worked till 18:00, with our own team meeting at 18:30 to wrap up the day and adjust the programme if needed.
The virtual site visit started with an introduction from the applicant's side. To feel the atmosphere and make the mental switch to the virtual site visit, the opening was in the local language and directly translated into English. They showed a video of the project site and a presentation of their project team structure. Our TLA gave an overview of the IPMA PEM and we introduced ourselves. After the lunch break we got an explanation of the site (the project result), they gave us a 360° live view of the project site, and we finished the day by discussing Part B, Processes & Resources, using our prepared questions.
During our evening team meeting we reviewed the day. We hadn't managed to complete both subcriteria in B, so we had to find some room for that in the next day's agenda. On top of that, we noted that most of the questions were being answered by one or two people, and we wanted to hear more from the others as well. To make this happen, we agreed to change the agenda to allow interviews in smaller groups, explicitly mentioning which roles from their side had to be present in which group. On day 3 we would need three independent meeting links.
On day 2 we started with the first document review session. This was quite an experience too. In line with IPMA's guidelines we were not allowed to receive the documents ourselves (in my case, as part of the group looking at documents written in the local language, this was no problem, because I couldn't read them anyway). Reviewing documents meant that we asked for a document; they opened it in their document system, shared their screen and scrolled through it, while our local team member translated for us. This is a challenging exercise, and you are completely dependent on the local team assessor for the on-the-spot translation.
During our evening meeting we filled in the complexity matrix to be used by the judges. This was new for everyone except our TLA, who had used it in previous assessments. As a PM method freak, I will briefly explain this tool. There are many debates about complexity: many people call something complex when in reality it is merely complicated. For me, and I expressed this during our meeting, something is complex when you don't know the relationship between cause and effect; only in hindsight, e.g. during a retrospective, can you find out the relationship. In these complex situations, in line with Snowden's Cynefin model (see figure), you have to start with experiments before you can make decisions.
In the complexity matrix used by IPMA, both the applicant and the assessor team have to give their view.
We had to give our score and justification on the following six aspects:
Complexity related to project objectives
Complexity related to stakeholders
Complexity related to cultural and social diversity & geographical environment
Complexity related to the project delivery process or its outcome (products & services)
Complexity related to the project team and contract management
Complexity related to uncertainty (risks & opportunities) and volatility of the project.
As a guideline, we received several examples for each of the six aspects. The score could be low, some, medium, high or extreme complexity. I would say that finding the right score with this scale is rather arbitrary. Personally, I would prefer the following scale: simple/obvious, complicated, complex or chaotic.
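To make the mechanics of such a matrix concrete, here is a small sketch. The aspect names come from the list above and the ordinal scale is the one the guideline prescribes; the example scores and the comparison logic are my own illustration, not IPMA's procedure.

```python
# Illustrative sketch of the complexity matrix: both the applicant and
# the assessor team score six aspects on an ordinal scale, and large
# differences between the two views are flagged for discussion.
# Aspect names and the scale come from the guideline; the example
# scores and comparison logic are invented for illustration.
SCALE = ["low", "some", "medium", "high", "extreme"]

ASPECTS = [
    "project objectives",
    "stakeholders",
    "cultural/social diversity & geography",
    "delivery process or outcome",
    "project team and contract management",
    "uncertainty and volatility",
]

def compare(applicant: dict, assessors: dict) -> dict:
    """Per aspect, how many scale steps the two views differ (signed)."""
    return {a: SCALE.index(assessors[a]) - SCALE.index(applicant[a])
            for a in ASPECTS}

# hypothetical views: applicant rates everything "high"; the assessor
# team disagrees on two aspects
applicant_view = dict.fromkeys(ASPECTS, "high")
assessor_view = {**applicant_view,
                 "stakeholders": "extreme",
                 "delivery process or outcome": "medium"}

for aspect, gap in compare(applicant_view, assessor_view).items():
    if gap != 0:
        print(f"discuss '{aspect}': views differ by {gap} step(s)")
```

A comparison like this makes visible exactly where the applicant's self-assessment and the assessor team's view diverge.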
During day three of our virtual site visit we had lots of interviews (in pairs) with local and international contractors and stakeholders to discuss satisfaction. The last interview was with the complete assessor team and the applicant's team, to discuss project results and the impact on the environment. During the closing session there were a lot of compliments on both sides, a virtual team picture, and clapping in the rooms and behind our screens.
During our subsequent Skype meeting we discussed the next steps. We agreed that everyone would fill in their own finding sheet, with approximately three strengths and three areas for improvement for each category (single sentences to be used as bullet points in the 2nd Judges Report), to be delivered by the next morning at the latest.
Fixation
On day 4 of the virtual site visit we started enthusiastically at 11:00 AM CEST to discuss the combined finding sheet, starting more specifically with the category "Realization of results as defined in project objectives". It took us almost two hours to discuss this first category.
I asked for a process intervention, because this journey was new to me: what exactly did we have to deliver today?
We had to clean up (rephrasing, removing duplications) the finding sheet, and the results would be copied into the 2nd Judges Report. Next we had to do a final scoring, discuss the big differences, and finally complete the 2nd Judges Report. Luckily, we had already finished the complexity matrix. The feedback report didn't need to be created that day; based on our finding sheet we only needed to translate the findings into the right wording. So, to make it more practical, I pointed out that if we continued working the way we were, we had 18 criteria to discuss, and 18 times 2 hours is 36 hours. That would be a long day! So we decided to timebox at 15 minutes per criterion, which still meant 4.5 hours of clean-up without breaks. In the end we took two short breaks and finalized the clean-up around 8:40 PM (an average of 30 minutes per criterion).
We still had the 2nd Judges Report and our final scoring to complete. In the Judges Report we had to answer the additional judges' questions and give the five most important project strengths and the five most important improvement potentials. To complete the report, we had to give the judges our recommendation for this project and the main arguments supporting it.
Filling in our final score after consuming so much information about the project was a piece of cake. Yes, around 10:45 PM we managed to finish the job, we congratulated each other and wished everyone a good night. Our TLA would integrate our cleaned-up finding sheet into the 2nd judge report and we all agreed to translate the findings into the correct wording by the end of the next week and to have a retrospective in a couple of weeks.
Four weeks later, we had our online retrospective, structured along the phases of the assessment:
Phase 1: individual assessment
Phase 2: virtual team meeting and preparation of the site visit
Phase 3: online site visit, second virtual team meeting and feedback report
See the appendix for our lessons learned; in our opinion, this is something every assessor team should do and share with others.
Conclusion
The online site visit was challenging but definitely not an agony. It was my first assessment, but thanks to the training the Project Excellence Model was no longer new to me. I could use it for the individual assessment, I carried out interviews like the other assessor team members, and I participated in building the judges reports. It was intensive and very interesting; the days behind the screen were long, but it worked out well. We had fun, especially during our evening sessions and during breaks, when we used our own private Skype environment. Of course, we all missed the local atmosphere, the sometimes very important (informal) talks with local project team members around the coffee machine, or looking someone in the eyes when interviewing. I believe that had we been at the applicant's premises the quality of our assessment would have been higher, but I think the virtual site visit delivered enough information of good quality for the judges to make their decision (and all applicants are in the same situation).
I found this assessment an awesome experience. Being part of a completely new team (different genders, skills and nationalities) performing the assessment was wonderful. The opportunity to look into a multi-billion project running in a completely different culture was amazing and enriched my world view. I am looking forward to participating in the next one, even if it is again a virtual assessment.
Appendix: Lessons Learned
As a structure, we adopted the phases of the assessment.
Phase 1: individual assessment
Went well:
enough time to do the assessment
there was room for Q&A to avoid uncertainties with the TLA
numbering of the application text helps with orientation
Areas for improvement:
additional instruction: write sentences with one finding / question to one topic – not several topics per sentence
list of referenced documents was very dense and packed – give a short explanation about each document
for each finding, fill in whether it is a strength (S) or an area for improvement (AFI), the related PDCA step, and the line numbers, in the individual comments
Phase 2: virtual team meeting and preparation of the site visit
Went well:
TLA demonstrated flexibility in integrating assessors into the team
helpful to have several one-hour Skype sessions for the assessor team
helpful to test the technical environment with the applicant
select the five most important questions for each criterion
Areas for improvement:
introduction of the complexity matrix during the assessor training
parallelize more sessions in the initial agenda for the site visit / at least for day 2 and 3
check with the Awards PMO, which kind of additional documents can be sent to the team
Phase 3: online site visit, second virtual team meeting and feedback report
Went well:
helpful to use Skype and WhatsApp for internal team communication / use different devices for communication with the applicant
use of the list of findings was helpful to discuss the team results and as basis for the feedback report
Areas for improvement:
address non-managers as interview partners, to get information from different levels
check that agenda changes are communicated to all interview partners of the applicant
confirm all timelines and team agreements in writing
set an agenda for day 4, to make time management possible
do a cross check between the complexity matrix and the 5 major strengths and areas for improvement
Final conclusions:
good teamwork in special conditions
awesome experience
team operated in a flexible manner
interesting insights into a different culture and complicated and complex project
A few weeks ago, I attended a three-day assessor training on the IPMA Project Excellence Model in Vilnius, Lithuania. For three days, using action learning, we familiarized ourselves with the Project Excellence Model. This model is used to judge which large and mega-sized projects will receive the Project Excellence Award.
The Project Excellence Model (PEM) is described in the book Project Excellence Baseline for Achieving Excellence in Projects and Programmes. The Project Excellence Model is a great tool for continuous improvement of project or programme management in your organization; it is not a maturity model. The main purpose of the Project Excellence Baseline (PEB) is to describe the concept of excellence in managing projects and programmes. It complements the IPMA Individual Competence Baseline (IPMA ICB) and the IPMA Organisational Competence Baseline (IPMA OCB).
The book describes a project in its organization’s internal and external context. The concept of project excellence is based on continuous improvement (plan-do-check-act), the role of sustainability and the role of leadership.
The PEM model structure enables easy reporting of the outcomes on all management levels via three levels:
Areas: the main components of the model: People & Purpose, Processes & Resources and Project Results
Criteria: to enable detailed feedback about the levels of excellence on a particular project
Examples: actual practices typically found in excellent projects.
All three areas of the model strongly interact with each other. See the arrows in the figure. This means that none of the areas should be developed in isolation and each of the areas should be actively used to develop excellence in the remaining two. Due to interaction between areas the following business value can be secured: performance, effectiveness and efficiency, reliability, flexibility, continuous improvement, scalability and sustainability.
The People & Purpose area is divided into three criteria: A.1. Leadership & Values; A.2. Objectives & Strategy; A.3. Project Team, Partners & Suppliers.
The Processes & Resources area is divided into two criteria: B.1. Project Management Processes & Resources; B.2. Management of Other Key Processes & Resources.
The Project Results area is divided into four criteria: C.1. Customer Satisfaction; C.2. Project Team Satisfaction; C.3. Other Stakeholder Satisfaction; C.4. Project Results and Impact on Environment.
In a separate chapter, the assessment of project excellence, the assessment process itself, the role and competences of project excellence assessors, the scoring approach and the project profile are described in detail. The project profile consists of three general scores, one each for People & Purpose, Processes & Resources and Project Results. Examples of conclusions after an assessment could be: leadership-driven projects with low process maturity; process-driven projects with low leadership and/or sense of purpose; and balanced projects combining great leadership and a strong sense of purpose with a strong process culture.
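The example conclusions above can be read as a simple classification over the three area scores. Here is a hedged sketch: the 0-100 scores, the gap threshold and the labels are my own illustration, not the PEB's actual scoring rules.

```python
# Illustrative sketch of reading a PEM project profile: three area
# scores lead to a qualitative conclusion. The 0-100 range, the gap
# threshold and the labels are invented for illustration; the PEB
# defines its own scoring tables.
def profile_conclusion(people_purpose: int, processes_resources: int,
                       project_results: int, gap: int = 20) -> str:
    # project_results is carried along for completeness but not used
    # in this toy rule, which only contrasts the first two areas
    if people_purpose - processes_resources >= gap:
        return "leadership-driven, low process maturity"
    if processes_resources - people_purpose >= gap:
        return "process-driven, low leadership/sense of purpose"
    return "balanced project"

print(profile_conclusion(80, 50, 60))  # leadership outweighs process
print(profile_conclusion(55, 60, 58))  # roughly balanced
```

The point is only that the three scores together form a profile, and the judges draw a qualitative conclusion from their relative balance rather than from any single number.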
In the annexes you get a very detailed description of the Project Excellence Model and the scoring tables for the model areas and criteria. The last annex explains the IPMA Global Project Excellence Award assessment and its benefits for stakeholders, applicants and for finalists and winners.
Conclusion: Not only a book for assessors or applicants of the IPMA Project Excellence award but for project sponsors, project or program managers or PMO/Centre of Excellence staff too who can use it as a great tool for continuous improvement of project or program management.
In recent months, a number of books related to the ICB4 have been published:
IPMA Examengids ICB4
Projectmanagement op basis van ICB versie 4
Better Practices of Project Management
These books can be used to prepare for the IPMA B, C, D and PMO certifications. In this blog I want to show you what these publications consist of and to what extent they overlap or differ.
The official ICB4 describes, for the domains of project, programme and portfolio management, 29 competence elements divided over three competence areas:
5 contextual competences
10 behavioural competences
14 technical competences (13 for project management).
IPMA Examengids ICB4
In the booklet Examengids ICB4, written by Bert Hedeman, John Hermarij and Sven Huynink, the 28 competence elements for the project management domain are further subdivided into 68 competences. These competences are then elaborated into 286 learning outcomes. For each learning outcome a short explanation is given, along with the knowledge level (understand, apply, analyse) required for the IPMA B, C, D and PMO certifications. The main structure of the exam guide follows the structure of the ICB4.
I think it is a missed opportunity that the ICB4 and the learning outcomes lag several years behind. In many organizations being agile dominates, and that is absolutely not reflected in the ICB4 and the associated learning outcomes. Think of business agility, start-ups, technological disruption, fintechs, blockchain, virtual reality, artificial intelligence, IoT, the innovation life cycle, the lean-agile mindset, Kotter's XLR8 model, DevOps, BusSecOps, BusDevOps and the scaling of agile teams with the corresponding frameworks, slicing epics, features and user stories, different forms of retrospectives, lean start-up and the use of the minimum viable product, A/B testing, feedback loops, the shift from project work to regular work, automated testing and deployment, agile portfolio management, and new roles (to be certified?) such as Scrum Master, Product Manager, Product Owner, Integration Manager, Release Train Engineer, Roadmap Manager, Epic Owner, etc.
You can also question the number of learning outcomes. Haven't we gone too far here?
Projectmanagement op basis van ICB versie 4
The book Projectmanagement op basis van ICB versie 4, written by Bert Hedeman and Roel Riepma (634 pages), describes only the project management domain and is thus in line with the booklet IPMA Examengids ICB4.
The authors, however, have chosen a structure that deviates from the ICB4.
First, the contextual competence Project orientation is described as an introduction, and then the 13 technical project management competences are distributed, and sometimes split, across the life cycle of a project:
Project preparation competences: project preparation phase, stakeholders, project organization, requirements and objectives, risks and opportunities, and project approach
Project execution and closure competences: procurement, change control, configuration management, information and management systems, control and reporting, change and transformation, and closure.
The behavioural competences are divided into two blocks:
Managing yourself: self-reflection and self-management, personal integrity and reliability, and personal communication
Connecting with others: relationships and engagement, leadership, teamwork, resourcefulness, results orientation, negotiation, and conflicts and crises.
The five contextual competences are spread much further apart and subdivided into:
Implementing change: strategy, programme management, portfolio management, setting up PPP and PMO organizations, and process development methods (in my view, "delivery methods" would be a better description)
External environment: health, security, safety and environment; sustainability; laws and regulations; power and interests; and culture and values.
Having chosen their own structure (in my opinion a logical one for the reader), I would have expected a cross-reference table in the description of the book's structure, showing the ICB competence elements (in ICB4 order) and the corresponding chapters in this book.
All chapters have a standard structure, consisting of: a description of the core competence; the learning objectives (in line with the Examengids ICB4); definitions; performance indicators by which fulfilment of the competence can be measured (in accordance with the ICB4); positioning (where and when this competence matters); content sections with a description and examples of the learning outcomes; and, finally, responsibilities.
In addition, the descriptions indicate whether particular learning outcomes do or do not apply to the IPMA B, C, D and PMO certifications. A bibliography is included at the end of the book.
The authors have succeeded in presenting the various learning outcomes in a clear and understandable yet concise way, supported by many diagrams and tables. Given the 286 project management learning outcomes, that is no small feat.
To support the book, a website has been developed, www.PMversie4.nl, with a complete overview of all definitions used. Additional examples and corrections will also be published there, and questions and comments about the book can be submitted.
Better Practices of Project Management
The English-language book Better Practices of Project Management, written by John Hermarij (712 pages), states in its introduction that, unlike the book described above, it covers the entire ICB4, i.e. the project, programme and portfolio management domains. In John's book I regularly encounter the distinction between project and programme (for example, sections on roles in projects and roles in programmes, or documents in projects and documents in programmes). Portfolio management, in my view, is much less developed, and I therefore wonder whether it is justified to claim that this book is suitable preparation for the IPMA certifications for senior portfolio manager (B) and portfolio director (A).
The author has chosen to stay close to the ICB4 and devotes a separate chapter to each competence element. The order of the competence areas does differ: first the 14 technical competences are described, then the 10 behavioural competences and finally the 5 contextual competences.
In this book, too, all chapters have a fixed structure. Each chapter starts with a photo and the key concepts, followed by the corresponding definitions, a description of the competence, the actions you must take to fulfil the competence, an assessment questionnaire to identify development points, a summary of related key topics (mostly IPMA learning outcomes), exercises to develop the competence further, and a reference to the book's website for additional techniques, questions, downloads, etc.
Looking at the learning objectives in the exam guide, there is no simple one-to-one mapping. Nor is any distinction made as to which learning objectives are or are not relevant to the various IPMA certifications.
This author, too, succeeds in describing the large amount of theory in understandable terms and frequently uses tables and graphs for clarification. The book has no separate bibliography, but where relevant, footnotes refer to the corresponding literature (which is convenient). Now and then the author takes you along in his sometimes "philosophical" train of thought (for example, "you are thrown into a project") and describes topics you don't always come across, such as Islamic finance, virtue ethics (Plato), etc.
For support there is the website www.betterpracticesofpm.com. This site is an interactive learning environment that is explicitly positioned as an integral part of the book. It contains extra explanations, discussions, supporting videos and, depending on the target group (trainers, for example), additional training material.
Conclusion:
For my opinion on the IPMA learning outcomes, see the section on the IPMA exam guide.
That aside, both books are fine reference works. Because of the large number of learning outcomes, the treatment is sometimes a bit brief, and certain topics would have benefited from more explanation, but then the books would no longer be manageable.
If you want to obtain an IPMA project management certification in the Netherlands on your own, the book Projectmanagement op basis van ICB versie 4 is the better choice, because it guides you more explicitly through which learning objectives matter for your certification.
If you are a project or programme manager and want to develop further in the field, the book Better Practices of Project Management, with its assessments and actions, offers more than enough guidance, and the accompanying website is a welcome addition.
If you take a training course at an IPMA-accredited training institute, the training will pay sufficient attention to the learning outcomes, and both books can serve as background material.
During the IPMA World Congress 2015 in Panama, the new IPMA Individual Competence Baseline for Project, Programme & Portfolio Management, 4th version, was launched.
The old ICB3 describes 46 competence elements, covering techniques of project management (20 technical competences), the professional behaviour of project management personnel (15 behavioural competences) and the relations with the project's context (11 contextual competences).
This new version describes three domains of expertise extant in business today:
Project management
Programme management
Portfolio management
Each domain contains the 29 competence elements organized in three competence areas (see also the attached QRC: ICB4 (QRC, 151003) v1.0):
People: defining the personal and interpersonal competences required to succeed in projects, programmes and portfolios (10 People competences)
Practice: defining the technical aspects of managing projects, programmes and portfolios (14 Practice competences; number 14, Select and balance, is not applicable for projects)
Perspective: defining the contextual competences that must be navigated within and across the broader environment (5 Perspective competences).
The book (416 pages) describes all competences for project, programme and portfolio managers separately. It contains cross-references to the ISO21500 and ISO21504 and to the old ICB3.
For each competence you will get the definition, purpose, description, knowledge, skills and related competence elements and a related set of key competence indicators (description and measures).
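The structure described above, three domains sharing the same three competence areas, can be sketched as a small data model. The counts come from the text itself; the element names are omitted, and the check for the project domain reflects the note that "Select and balance" does not apply to projects.

```python
# Sketch of the ICB4 structure described above: three domains share
# three competence areas. Counts per area are taken from the text;
# the Practice element "Select and balance" does not apply to projects.
AREAS = {"People": 10, "Practice": 14, "Perspective": 5}

def element_count(domain: str) -> int:
    """Number of applicable competence elements for a given domain."""
    total = sum(AREAS.values())  # 29 competence elements in total
    if domain == "project":
        total -= 1  # "Select and balance" is not applicable for projects
    return total

for domain in ("project", "programme", "portfolio"):
    print(domain, element_count(domain))
```

This also explains why the exam guide discussed elsewhere in this blog works with 28 competence elements for the project management domain.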
From time to time I receive questions regarding standards or frameworks. There are several standards or frameworks and for many it will be difficult to see the wood for the trees. Most of the standards and frameworks are delivered by APMG and PMI. They both have standards on Portfolio, Programme, Project (Agile and Waterfall) and Risk Management and both have a Maturity Model. APMG offers a standard for Portfolio, Programme and Project Offices too, as well as frameworks on Management of Value and Benefits Management. Others like IPMA offer a Competence Baseline and a model to assess organizations. ISO 21500 has been added as the overarching standard on Project Management.
For Agile Scrum we see new organizations popping up like Scrum.org and Scrum Alliance. To complete the Quick Reference Card I added the ITIL suite too.
I am sure this QRC (Standards (QRC, 131116) v1.0) is not complete but it gives in my opinion a good overview of the most important standards and frameworks. Feel free to comment, share your thoughts or refer to some missing standards that could be added.