LIFELONG LEARNING PROGRAMME – Transversal Programme

VALEW Model Guidelines
How to validate learning at work

Impressum
VALEW Model Guidelines – How to validate learning at work

This publication has been produced by:
Fondazione Politecnico di Milano (Italy) – The Project Coordinator
The University of Edinburgh (The United Kingdom) – Final text revision
Estonian Qualification Authority (Estonia)
Foundation European Centre Valuation Prior Learning (The Netherlands)
Scienter España (Spain)
Scienter (Italy)
Agenzia Regionale per l’istruzione la formazione e il lavoro – ARIFL (Italy)
IG Metall (Germany)
Université de Strasbourg (France)

Publication date: July 2010
Reproduction is authorised provided the source is acknowledged.
The VALEW project – To VAlidate LEarning at Work – has been funded with support from the European Commission. Project Number: 147634-LLP-1-2008-1-IT-EQF
This publication reflects the views only of the author, and the Commission cannot be held responsible for any use which may be made of the information contained therein.
Photos: Agentur für Erneuerbare Energien e.V., © Yuri Arcurs - Fotolia.com, © Monkey Business - Fotolia.com, © Fotolia.com
Design & Layout: Jasmina Car

Content
1 Introduction 4
   The VALEW Model 5
   Why might you want to use these guidelines? 7
   How to use the guidelines 7
2 Identification of Standards (Estonian Qualification Authority) 8
   Basic Concepts 9
   Linking the Labour Market to the Lifelong Learning System 10
   Competence Standards and Frameworks 11
   Development of Competence Standards and Frameworks 13
   Functional and Occupational Mapping 14
3 Collection of Evidence (Fondazione Politecnico di Milano) 20
   Evidence Identification and Collection Procedures 22
   Portability of Evidence 23
   Evidence Categories and Examples 24
   Assigning Levels to Evidences 24
   How to Build and Evaluate Evidence 26
4 Recognition, Assessment, and Certification (Foundation European Centre Valuation Prior Learning) 27
   The Overall Rationale: The total VPL Process in 5 Phases and 10 Steps 27
   VPL in 5 Phases 28
   Types of Award 31
5 Reference to EQF and other existing Qualification/Competence Frameworks (national, regional, sectoral etc) (Scienter; the University of Edinburgh) 32
   The EQF-based Approach 32
   EQF Levelling: General Indications 34
   EQF Levelling: Application in the VALEW Model 35
   Further Considerations 38
6 Ensure Recognition (IG Metall; ARIFL; Scienter España; Fondazione Politecnico di Milano) 40
   Which Players Can Recognise and therefore Add Value to a Certification? 40
   Attractiveness of and Resistance to Validation of Learning Outcomes at Work from the Various Stakeholders’ Perspectives 41
   Key Factors of Success 44
   Readiness Index and Proposal to Check Progress against Generalised Validation and Recognition of Learning Achievement in the Workplace 45
   Moving Forward 47
7 Annexes 48

1 Introduction
The VALEW Model brings together in one volume everything companies, their employees, training agencies and policy-makers need to know about how to unmask and validate skills learned informally at work. The book is a comprehensive introduction to identifying workers’ competences, evidencing and assessing those competences, awarding recognition, levelling the competences to the European Qualifications Framework, and forward planning for the inclusion of all of these processes in industrial education and training systems.
The validation of competences acquired in non-formal/informal learning environments is playing an increasingly important role at European level, particularly as regards the principles of lifelong and life-wide learning. The European Guidelines for the Validation of Non-formal and Informal Learning, as well as all the recently created European-level tools (e.g. the European Qualifications Framework), move in this direction: they aim to build trust across European countries to facilitate transparency, mobility, social inclusion and competitiveness. These instruments are meant to offer a framework and a set of guiding principles around which to attune/design systems for the validation of competences while maintaining the specificities that bind those systems to the contexts from which they come, and their particular needs and goals. With this same rationale, the VALEW project – to VAlidate LEarning at Work – aimed to develop a model for certification of competences acquired in non-formal/informal learning environments (notably the workplace), complete with criteria for demonstrating the competences and their level with reference to the EQF.
Specifically, VALEW activities included:
– A benchmarking activity among partner countries to understand and cross-compare the methods in use in each country, shedding light on opportunities and criticalities of the certification processes concerning non-formal and informal learning
– The development of a shared model for the validation of competences acquired at work that integrates and homogenises local approaches
– The production of a set of guidelines on how to build a validation system, containing recommendations based on the indications which emerged during the lifespan of the project, including stakeholders’ consultation and piloting
This document contains the VALEW guidelines on validation of learning at the workplace. It addresses any actor – be it a company, an employer, a public authority, a labour union, a trade union, an HR department, etc – interested in establishing a credible system for the recognition of competences informally acquired at the workplace. Based on the VALEW Model, the guidelines cover all the elements identified as necessary to develop a validation system, from the identification of standards up to the mechanisms for recognising the certified competences, and they accompany the user in understanding the underlying concepts and in choosing and implementing the most adequate methods and procedures to fulfil the expected objectives, according to the specific needs and particularities of one’s own context.
The VALEW Model
When using these guidelines, one will notice that the VALEW Model for the validation of competences acquired at work is not meant as a binding tool aimed at replacing existing local practices, but rather as a guiding framework addressing any actor willing to establish a validation system and providing the conceptual and operational map to adapt it to specific contexts and different needs. Flexibility and openness allow the existing context – including both its opportunities and constraints – to be embedded in the development of a validation system, enabling a context-based attuning of the instruments provided, respecting legal constraints and existing regulation, and taking advantage of available resources.
To develop such a comprehensive model, its rationale and its core elements, the VALEW partnership undertook a comparative analysis of already-existing practices in participating countries, then went through consultation with all relevant actors directly involved in a potential system (public authorities, employers, employees, professional associations, etc) to validate and broaden the set of relevant elements covered (be they strictly related to the validation methodologies or to more political concerns).
The guidelines therefore include all the necessary steps to build and run an effective and quality-assured validation system (including the adoption of standards, the overall assessment procedure, and the mechanisms for external recognition and for levelling the recognised learning outcomes to the EQF and any other relevant frameworks), while defining them in relation to the external constraints and surrounding factors which need to be considered when starting to conceive a validation system. The belief behind VALEW is in fact that the analysis of the context is paramount for the development of a successful validation system, in order to tailor it to the specificities of the context and avoid duplication of effort. At the methodological and technical level, this means being aware of the constraints as well as the opportunities offered by the system, including instruments, methodologies and infrastructures already developed by others. From a more political perspective, commitment on the part of a wide set of relevant actors (employers, employees, labour unions, public authorities etc) is essential to any validation system able to achieve social impact, and their opinions, expectations and interests must be investigated.
[Figure: the VALEW core model. It brings together the reference frameworks and quality assurance; the socio-cultural dimension (e.g. validation vs evaluation); policy aims (employability, labour-school pathways); the relationship with the EQF and the competence frameworks (levelling); the ten process steps (1. Informing, 2. Application, 3. Intake, 4. Instruction/coach, 5. Portfolio/proof, 6. Assessment, 7. Assessment report, 8. Accreditation, 9. Certificate/diploma, 10. Evaluation); labour market motivations (employers, professionals); the responsibilities involved (assessor, administration, assessment body, certification/certifying body); the service infrastructure around certification (informing, training services in lifelong learning using a voucher system, coaching services); ensuring recognition; and the economic dimension to be considered (development costs, running costs).]
Why might you want to use these guidelines?
There are several reasons why you might wish to use these guidelines. The VALEW Model, on which the guidelines are based, is a European product, which offers the widest coverage while respecting diversity and grass-roots initiatives thanks to its flexibility and openness. In particular the model:
– Is based on multiple concrete experiences and validated at European level: the VALEW Model was designed starting from a critical analysis of existing European experiences and was validated throughout by grass-roots consultation with relevant stakeholders and through dialogue with experts at EU level.
It is also based on critical outcomes emerging from pilot interviews and focus groups carried out in the UK, France, Germany, Italy and Spain
– Allows flexibility to adapt to different national and organisational contexts
– Allows benchmarking and bench-learning, by working also as an analytical tool and an interpreting model to compare validation systems across Europe and across different sectors
– Is evolutionary, open to future development, thanks to its capacity to represent through abstract categories its building blocks in evolutionary scenarios, and to move its borders to include/exclude selected elements
How to use the guidelines
The guidelines are divided into five chapters referring to the main blocks of the whole validation process, as represented in the core model. These chapters are:
2. Identification of Standards
3. Collection of Evidence
4. Recognition, Assessment and Certification
5. Reference to the EQF and other Qualification Frameworks
6. Ensure Recognition
Each chapter includes an explanation of the concepts addressed and of the role that the specific step plays in the system. The core of the chapter is dedicated to recommendations to the reader on how to achieve the specific step according to available options. The reader will be guided in recognising the most adequate solution by being equipped with a review (and explanation) of existing alternatives and different scenarios. The guidelines reflect the adaptability and openness of the VALEW Model. Thus they can be used as a flexible tool on the basis of specific needs. While some might feel the need to be equipped with “from cradle to grave” indications for the development of a system, covering all the steps, others might need only a few chapters. Likewise, while the core recommendations should be adequate for all, it is possible that some indications can be completely disregarded by some readers but be highly valuable for others. This perfectly fits the rationale of the VALEW Model itself and does not detract from the value of these guidelines.
2 Identification of Standards
Estonian Qualification Authority
Any competence recognition and certification process needs to refer to competence standards. Frameworks and standards are necessary because they are the shared reference against which both individual and organisational competences can be recognised and evaluated. In this chapter, the Estonian Qualification Authority will take the reader through their national frameworks, showing how they can be constructed and how they can link the labour market with the learning system. Accordingly, this chapter is the prerequisite for developing a consistent competence evaluation system. Any vocational (professional) qualifications system constitutes an interface between the labour market and the lifelong learning system. In this interface, two complementary and concurrent processes occur: 1) identification and formalisation of labour market needs and expectations for employees’ competences; 2) assessment, validation and certification of employees’ competences. Labour market needs and expectations are formalised in sets of competence standards. There are two groups of competence standards: 1) occupational standards (competence units and job (occupational) profiles); and 2) training and learning (education) standards (qualification units and qualifications). In addition, there could be assessment standards.
All these types of competence standards can be referenced to different competence frameworks: 1) sectoral competence frameworks; 2) qualifications frameworks; and 3) qualifications and credit frameworks. Functional and occupational mapping of competences based on functional analysis and standard-setting has proved to be the most efficient, flexible and comprehensive methodology for identification and formalisation of competence standards. This methodology can be applied to a sector of economy (industry), to a sub-sector, to an enterprise or to an occupation. In the context of the VALEW Model, identification of standards means identification of competence standards. This is based on terminology adopted in the European Qualifications Framework for Lifelong Learning (EQF)1. 1 http://ec.europa.eu/dgs/education_culture/publ/pdf/eqf/broch_en.pdf 8 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 2 Identification of Standards Estonian Qualification Authority Basic Concepts Qualification: a formal outcome of an assessment and validation process, which is obtained when a competent body determines that an individual has achieved learning outcomes to given standards Learning outcome: a statement of what a learner knows, understands and is able to do on completion of a learning process, defined in terms of knowledge, skills and competence Knowledge: the outcome of the assimilation of information through learning. Knowledge is the body of facts, principles, theories and practices that is related to a field of work or study. Knowledge is described as theoretical and/or factual Skills: the ability to apply knowledge and use know-how to complete tasks and solve problems. Skills are described as cognitive (involving the use of logical, intuitive and creative thinking) or practical (involving manual dexterity and the use of methods, materials, tools and instruments) Competence: the proven ability to use knowledge, skills and personal, social and/ or methodological abilities, in work or study situations and in professional and personal development. In the context of the EQF, competence is described in terms of responsibility and autonomy National qualifications framework: an instrument for the classification of qualifications according to a set of criteria for specified levels of learning achieved, which aims to integrate and co-ordinate national qualifications sub-systems and improve the transparency, access, progression and quality of qualifications in relation to the labour market and civil society Sector: a grouping of professional activities on the basis of their main economic function, product, service or technology 9 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 2 Identification of Standards Estonian Qualification Authority Linking the Labour Market to the Lifelong Learning System The most important consequence of fast labour market changes is that a closer link – and new methods for defining the link – between the needs of the labour market and the lifelong learning system (more specifically, the education and training system) are necessary. A vocational qualifications system is the proper link or interface between the labour market and the lifelong learning system (cf Figure 1 Competence circle). In the context of validating learning at work some elements of the competence circle may be optional, as indicated by broken lines. 
[Figure 1 Competence circle: the vocational qualifications system as the interface between the labour market and learning, linking job tasks, competences, occupational standards, education standards and curricula with learning, the assessment of expected against actual competences, validation, qualification awarding, and diplomas/certificates.]
The development of vocational qualifications systems, i.e. well-functioning competence circles, is a complicated task as it means linking two different logics: the logic of the world of employment and the logic of the world of education and training. Employers are interested in what people need to do, how they will do it, and how well they will do it (i.e. OUTCOMES). Education professionals are interested in what people learn, how they will learn it, and how the quality and content of learning will be assessed (i.e. INPUTS, methods of teaching, learning and assessment). In the process of developing a vocational qualifications system the needs and expectations of the labour market must be translated into a language that can be understood in education and training, including the assessment of persons’ learning outcomes. One immediate option is to develop training and learning standards (qualifications and curricula) which describe what people will be able to do at the end of a study program. Then, learning outcomes can be linked to employment outcomes (occupational standards). Nevertheless, the primary linkage is between the occupational standard outcomes and the training standard outcomes. If this direct linkage is made, employment requirements will directly determine the learning outcomes.
Occupational standards are important instruments in their own capacity, because they provide the basis for a variety of activities related to human resource management (recruitment, selection, promotion, remuneration, training, certification or assessment) inside the so-called “small competence circle” where training and learning processes are left out. In the “large competence circle” these standards additionally serve as an instrument for the elaboration of qualifications and study programmes, thus ensuring relevance, transparency and quality of the outcomes of the training and learning processes.
Competence Standards and Frameworks
“Competence” in the framework of the VALEW Model means “competence at work”. A competence standard is a description of what an individual needs to do, know and understand in order to carry out a particular job role or function in a particular sector of economy (industry). It specifies the standard of performance that people are expected to achieve in their work, i.e. the knowledge and skills they need in order to perform effectively. These standards are agreed by employers and employee representatives. The set of competence standards consists of two groups:
1. Occupational standards: competence units; job (occupational) profiles
2. Training and learning standards: qualification units; qualifications
All the aforementioned competence standards may be referenced (incorporated) into their corresponding competence frameworks. A typology of competence frameworks consists of three major groups:
1. Sectoral competence frameworks (e.g. UK National Occupational Standards, the European e-Competence Framework)
2. Qualifications frameworks (e.g.
national qualifications frameworks, EQF) where individual qualifications (and possibly competence units and qualification units) are referenced to qualification levels 3.Qualifications and credit frameworks (e.g. English/Welsh/Northern Irish Qualifications and Credit Framework, Irish National Framework of Qualifications) where individual qualifications (and possibly competence units and qualification units) are referenced to qualification levels and additionally assigned a credit value 11 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 2 Identification of Standards Estonian Qualification Authority Occupational standards (different names are used in different countries: in UK National Occupational Standards; in France Référentiels Professionnels; in Germany Ausbildungsordnungen) are descriptions of the work functions to be carried out within the framework of a specific occupational activity, as well as the related: Knowledge (cognitive competences) Skills (functional competences) Competence (personal and ethical competences) Occupational standards are used to: Describe good practice in particular areas of work Set out a statement of competence which brings together the skills, knowledge and understanding necessary to do the work Provide managers with a tool for a wide variety of workforce management and quality control Offer a framework for training and development Form the basis of a range of vocational qualifications Each competence unit describes a key part of someone’s job. Units are split into elements, which contain performance criteria over a range of situations. Each unit will specify the knowledge, skills and understanding that people need to do their jobs. Units are often introduced with a summary or commentary, saying what the unit is about, who it is for, how it links to other units and how it fits into the qualifications framework. A job profile describes the full range of performance, knowledge and skills needed for a particular job (occupation) specified by grouping together competence units. Competence can refer to clusters (or a cluster) of identifiable actions relevant to a function, role or occupation which are measurable. When you combine the descriptions of expected knowledge, skills, competence and range into these units, you have a set of learning outcomes. A qualification unit (learning outcomes unit) is a knowledge, skills and competence set which constitutes a part of a qualification. The unit can be the smallest part of a qualification that can be assessed, validated and (possibly) certified. Therefore, assessment criteria are an essential compulsory component of a qualification unit. It is important to underline that in general there is no one-to-one correspondence between competence units and qualification units. A unit can be specific to a single qualification or common to several qualifications. A qualification is made up of separate qualification units that set out what a learner must be able to do and to what standard. Each unit is like a mini-qualification, representing a training target for building up credits towards a final award, diploma or certificate. The overall qualification is achieved through continuous training and assessment. Assessment is conducted through on-the-job observation, questioning (interview), or other relevant method (cf Chapter 3). Learners produce evidence to prove they have the competence to meet the qualification standards. 
Assessors sign off units when a learner is ready, after testing the learner’s underpinning knowledge, understanding and work-based competence to see if competence in the workplace can be demonstrated. 12 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 2 Identification of Standards Estonian Qualification Authority Figure 2 below illustrates relations between different groups of competence standards. Occupational Standard Qualification Competence Unit Qualification Unit Figure 2 Competence standards and frameworks Occupational standards are measurable industry benchmarks in that they specify what workers must be able to do at different levels of qualification or job responsibility. They should relate to recognisable job roles within the sector (industry), identified through a process of functional analysis (cf Section 2.4 below, Development of Competence Standards and Frameworks). A competence unit is the most disaggregated and detailed section of an occupational standards statement, drawn from the functional map and developed to include clear detail of the knowledge, skill and range required for, and delivered by, a qualification unit. Occupational standards are a group of competence units which together circumscribe a recognised job or occupation. The primary purpose of the occupational standards is to set a benchmark of knowledge, skills and competence by which education and training providers can demonstrate that they are qualifying individuals with the correct minimum competence to enter the labour market. These standards allow also for workers to be assessed in the workplace as to their competence against the expected competence described in competence units. For this to happen there need to be competent assessors in the workplace or specialist assessors brought in to observe and assess the competence of an individual. Development of Competence Standards and Frameworks Competence standards are developed by a standard-setting body (SSB) (e.g. Sector Skills Council (UK)), representing employers, industry professionals and educators. At European level, the CEN (European Committee for Standardization www.cen.eu) is a major provider of European standards and technical specifications. It supported the development of the European e-Competence Framework (e-CF; www.ecompetences.eu), a competence standard for the ICT sector developed in cooperation with large companies, both IT vendors/ providers and end-users. The e-CF is one of the frameworks that was used in the VALEW project for the testing phase on recognition of ICT competences. 13 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 2 Identification of Standards Estonian Qualification Authority An SSB usually manages standard-setting and qualification-awarding processes in a single sector of industry, e.g. transport. Therefore competence standards are typically developed sector-wise. For this purpose, the SSB establishes an expert working group or working groups. Draft competence standards are sent out for wider consultation in the sector. Final amendments based on the results of the consultation are then made before the draft standards are submitted to the SSB for official approval. The methodology applied for the European e-Competence Framework development has been recently documented in the CEN CWA publication "Building the e-CF – a combination of sound methodology and expert contribution" (CEN 2010). 
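To make the relationships summarised in Figure 2 more concrete, the following minimal sketch is offered purely as an illustration: it is written in Python, it is not part of the VALEW Model or of any of the frameworks cited, and all class and field names in it are hypothetical. It represents an occupational standard as a group of competence units and a qualification as a group of assessable qualification units, with a credit value attached only where a qualifications and credit framework applies.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class CompetenceUnit:
        # A key part of someone's job: what the worker must do, know and understand
        title: str
        elements: List[str]        # elements containing performance criteria over a range of situations
        knowledge: List[str]       # underpinning knowledge
        skills: List[str]          # underpinning skills

    @dataclass
    class OccupationalStandard:
        # A recognised job or occupation, circumscribed by a group of competence units
        job_role: str
        competence_units: List[CompetenceUnit]

    @dataclass
    class QualificationUnit:
        # The smallest part of a qualification that can be assessed and validated;
        # assessment criteria are a compulsory component
        title: str
        learning_outcomes: List[str]
        assessment_criteria: List[str]
        credit_value: Optional[int] = None   # assigned only in qualifications and credit frameworks

    @dataclass
    class Qualification:
        # A formal outcome of assessment and validation, made up of qualification units
        title: str
        eqf_level: int
        units: List[QualificationUnit]

Representing standards in such a structured form is only one possible convenience for tooling; the substantive work remains the functional analysis, consultation and approval process described below.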
The most general and commonly used methodology for competence standards and frameworks development is based on functional and occupational mapping of sectors, subsectors or occupations2. This methodology, described briefly below, has been successfully used in several sectors in the UK and internationally3. Functional and Occupational Mapping The aim of the functional and occupational mapping is to identify the work functions and the occupations within the sector. A sector knowledge, skills and competence framework includes: Functional map covering the whole sector of economy (industry) Framework of competence units developed from the functional map by disaggregating the identified competences to such an extent that they describe work tasks which can be measured, evaluated and linked to specific knowledge and skills Skills which need to be developed and which can be transferred to the workplace Knowledge statements, written as learning outcome statements, which underpin and support the competence and skills framework Common assessment strategy to validate all of the above components In addition it is advantageous to have a detailed body of knowledge extrapolated from the learning outcomes which describe the breadth and depth of knowledge which has to be understood by the individual and which can be tested in a controlled situation The Functional Map is designed to include: Functional analysis that forms the basis for the development of occupational standards as and when appropriate Functional analysis that forms the basis for the development of a labour market investigation Assessment of the impact of national and European legislation and codes of practice, statutory requirements and contractual conditions and the role played by key organisations including trade unions and professional or statutory bodies 14 | 60 2 Occupational and Functional Map of the UK Waste Management Sector. EU Skills – Final Report – Waste Management OFM 24 June, 2005. http://www.ciwm.co.uk/pmm/12160 | 3 Guide to Developing a Sector Qualification Strategy Including: Functional Mapping; a Competence, Skills & Knowledge Framework; a Quality Assurance Process. European Observatoire of Sport and Employment (2007-2008). http://www.easesport.org/ease/fichiers/File/25_mars_2009/eqf_sport_ combined_guides.pdf VALEW Model Guidelines – How to validate learning at work July 2010 2 Identification of Standards Estonian Qualification Authority The Occupational Map is designed to include: Broad job roles, by job category, vocational and academic qualifications, and links between the sector and other related sectors Changes in technology, working practices and market conditions An initial appraisal of the sector’s size, scope and boundaries The functional analysis should be cross-matched to existing occupational standards that relate to job functions on: The role played by qualifications for progression purposes The coverage of existing qualifications and those in development The links between the sector and other related sectors Information sources that have been used to make the analysis that might contribute to further development of occupational standards Functional analysis is a method used to identify the required competences of a productive function by means of a deductive strategy. By concentrating on the functions or outcomes instead of the activities, the descriptions produced are independent of the technology or methods used to achieve the function. 
In other words, instead of describing what people are doing, functional analysis describes what people have to achieve. The functional analysis process can be applied to a single occupation, a group of occupations, to a sub-sector or to a whole sector. The difference is only one of scale. The same general model for analysis of occupations can also be used as the basis for analysing occupational functions in the entire sector. Most sectors can be broken down into sub-systems. Within each of these sub-systems there are groups of occupations and work functions that have similar process characteristics, even though their application may be different. Design, for example, is a process that occurs in many sectors. A competence unit such as a design skill standard written for one sector can be adapted to meet many different work contexts and applications. Each of these systems’ areas can be further broken down, sometimes through several stages, into work functions for which competence units can be defined. The advantage of this framework approach is that it can be applied to any sector and it enables common functions to be recognised within and between sectors. This considerably lightens the workload in seeking matches with existing work roles and competence units and also introduces a rigour to whole-sector analysis that considerably simplifies and streamlines the process of occupational standards development. The starting point for functional analysis is the development of a general statement – a “key purpose” – which encapsulates what the sector is there to provide. This is subsequently broken down into key areas that represent the sum of broad activities that are necessary to deliver this service. The process of analytical breakdown continues until the activities described represent a reasonable description that could be applied to individual work outputs against which a competence unit can be set. Figure 3 below illustrates a functional mapping approach for the waste management sector. 15 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 2 Identification of Standards Estonian Qualification Authority The process of functional analysis raises a number of issues relating to the commonality of work functions across the sector. The mapping process goes only as far as defining distinctively different functions. It does not seek to delineate between the same function as it is applied in different sector contexts. For example, the function of “plant maintenance” can apply to a very wide range of different job roles within the sector. The extent to which these variations in context need to be reflected in different standards is an issue that is part of the standardssetting process. 
Similarly, there are a number of functions for which responsibilities within the work team are shared, for example the function of “Maintaining a safe and healthy working environment”:
– The Plant Operative will “contribute to” maintaining a safe and healthy working environment, because the scope of their responsibilities is necessarily limited
– The Plant Supervisor will “maintain” a safe and healthy working environment, because, on the ground, they have functional responsibility for action
– The Plant Manager will “ensure that” a safe and healthy working environment is maintained, because they are ultimately accountable for health and safety
Therefore, from this one function it will be necessary to develop three complementary competence units, probably at three different qualification levels, each reflecting the different level of responsibility. This differentiation is again applied at the standards-setting stage. Setting standards is an expensive, rigorous and time-consuming process involving considerable development work, consultation and testing. There is little purpose in unnecessarily duplicating effort when there are quite valid standards already in place. The final stage of the functional mapping process is therefore to identify all potentially relevant competence units that might be used within a future qualifications framework for the sector. Because of the hierarchical structure of the functional map, each of the functions appears only once. However, in occupations, the same function may be a component of several job roles. Hence there is no direct read-across from the occupational to the functional map. In the same context, many occupations are multi-role and the functions of which they are made up appear in different areas of the functional map. It is at the qualifications design stage that the functional map becomes extremely useful, as it is used as a menu from which functions comprising work roles can be drawn down.
Example of a functional analysis:
Key Purpose: Provide best value and environmentally compatible waste management services and solutions that optimise the value of waste as a resource, use cost-effective processing and resource management systems that exploit appropriate technologies, employ safe working practices, and consistently meet the needs and expectations of stakeholders, customers and the community within a regulated policy framework.
Key Areas: 1 Policy; 2 Specification; 3 Carry out Activities; 4 Control Programmes; 5 Manage the Organisation; 6 Maintain Capability
Key Functions within Key Area 3 (Carry out Activities): 3.1 Collect Waste; 3.2 Assess Waste; 3.3 Identify Waste
Work Functions (Units) within Key Function 3.1 (Collect Waste): 3.1.1 Confirm schedule; 3.1.2 Prepare for operations; 3.1.3 Operate vehicle and ancillary equipment; 3.1.4 Load, secure and unload materials; 3.1.5 Manage spillages; 3.1.6 Work with others to solve operational problems
Figure 3 Functional map for the waste management sector4
4 Occupational and Functional Map of the UK Waste Management Sector, p.7. EU Skills – Final Report – Waste Management OFM 24 June, 2005. http://www.ciwm.co.uk/pmm/12160
The major challenge in establishing a valid and robust occupational map for a sector is to confirm those occupations that are truly unique to the sector. The occupational map will indicate all those occupations within the sector. It will also provide the basis for future research to identify all potentially relevant existing qualifications and where there are current gaps.
The map will also identify those occupations for which a given SSB has either a primary responsibility or one that is shared with other SSBs. The occupational map also serves a useful purpose in providing career development information for the sector, enabling the SSB to show the routes and pathways to vocational qualification for each occupation and bring together a database that includes:
– Academic awards
– Professional awards
– Other vocational qualifications
– Trade certificates
– Safety certificates
– Registration certificates
– Licences
The process of developing occupational standards is normally linked to a commitment to develop the related vocational qualifications. However, some sectors have developed the standards for their sector in advance of a clear specification as to how they will be used in vocational qualifications. The argument for taking the latter course is primarily because there is a utility in the standards that goes well beyond their use in qualifications. The standards can be used as a basis for recruitment and staff appraisal and for the specification of training within the sector or enterprise and have many other applications as well. Therefore, having a comprehensive framework of competence units from the outset is a major asset. Once the occupational standards development process has been completed, it remains for the standards to be grouped as units and for a structure to be devised for the award that properly reflects the variables in the occupation. Increasingly, vocational awards offer a range of options to take account of variations in the skills mix within jobs. Figure 4 below illustrates the functional and occupational mapping process.
[Figure 4 Developing occupational standards and vocational qualifications5: the functional and occupational mapping process. From the occupational map, an occupation (the example shown is Process Controller) is prioritised for qualification development and its work functions are defined; standards are then selected from the functional map and adapted to fit the work context; finally the qualification structure (core units and options) is defined.]
As part of the occupational mapping process, occupations are broadly classified into groups that reflect the main divisions within the sector. The NQF (national qualifications framework) sets out the levels at which qualifications can be recognised. NQFs are designed to help learners make informed decisions on the qualifications they need, by comparing the levels (and possibly credit values) of different qualifications and identifying clear progression routes for their career. In the case of a qualifications and credit framework, both level and credit value are attached to a competence or qualification unit. The occupational map can only ever be an approximation of the levels-structure within the sector. The actual decision on level comes at a later stage when each occupation is analysed in detail in terms of work functions and defined in terms of competence units.
5 Occupational and Functional Map of the UK Waste Management Sector, p.11. EU Skills – Final Report – Waste Management OFM 24 June, 2005.
http://www.ciwm.co.uk/mediastore/FILES/12160.pdf 18 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 2 Identification of Standards Estonian Qualification Authority When the functional mapping is complete it should be complemented with performance criteria, an extension of the functional analysis process which includes three specifications: employment specification (with identification of performance requirement and range), training specification (with identification of skills and knowledge) and assessment specification. Performance requirements are a specification of what has to be achieved in employment to meet the outcomes described in competence units. Range describes the field of application, i.e. the tools, equipment, materials, methods or processes needed to achieve the performance requirements. Skills are what is evidenced to prove performance has been achieved and knowledge describes what one needs to know to achieve performance. The assessment specification (assessment standards, assessment criteria) describes what will be assessed, the quality with which performance has to be achieved. The assessment specification identifies both what the person (employee or learner) must do and the knowledge which must be tested. The assessment specification does not give a detailed description of assessment methods. How the assessment will be made is determined locally. This is accomplished in dialogue between the assessor and the person to be assessed. The dialogue includes agreement about the type of evidence used in the assessment and validation process (cf Chapter 3 below). This is a matter of quality assurance: regardless who awards the qualification, the same assessment standards need to be applied. 19 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 3 Collection of Evidence Fondazione Politecnico di Milano Collection of evidence is one of the most critical steps in the recognition and certification process of competences acquired in non-formal/informal learning environments. In fact, evidence demonstrates the effective mastery of the competences to be recognised. Accordingly, candidates have to understand what pieces of evidence they can provide consistently to show competence dimensions (i.e. technical and behavioural) either in their single aspects or in a holistic way. This phase can be supported by tutors, in case candidates need to move from a tacit knowledge/experience/abilities condition to an explicit one through awareness-raising. In this chapter some main issues related to “evidence” will be discussed and concrete examples of building and evaluating evidence will be shown. These examples refer to the ICT sector with a particular focus on behavioural/soft skills, as planned in the VALEW proposal. In particular in Table 1 (see Annex 1) the European e-Competence Framework (e-CF) and the DOSY Framework (on soft skills) will be integrated and become the reference to collect evidence. (See also Chapter 2, paragraph 2.4.) Accordingly, in this chapter the main focus is on e-competences and ICT skills, where soft skills (transversal competences) are in some way embedded in the e-competences. In fact, “competence” is a holistic concept including all the integrated behaviours necessary to perform. So, while training may identify the components of competences and accordingly develop courses e.g. on skills and knowledge, “recognition” must keep the competence dimensions together, otherwise one is assessing either skills or knowledge, i.e. 
parts of competence, and not a whole competence. However, in the VALEW Model, the suggested tools and the overall approach may be applied to other competence sectors in addition to ICT. What will change is the specific evidence to be provided. It depends on the specific competence contents to be recognised. The European e-Competence Framework (e-CF) in brief (www.ecompetences.eu) The European e-Competence Framework (e-CF) is a European-wide reference framework of information and communication technology (ICT) competences that can be used and understood by ICT professionals and human resources managers including ICT user and supply companies, small and medium sized enterprises, the public sector, and educational and social partners across the European Union. The framework has been developed by a large number of European ICT and human resources (HR) experts in the context of the CEN Workshop on ICT Skills. 20 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 3 Collection of Evidence Fondazione Politecnico di Milano This European reference framework for ICT competences makes a link between national and company systems. It jointly defines 32 ICT practitioner and manager competences as needed and applied in the workplace. These 32 ICT competences are structured in four dimensions and classified according to their corresponding ICT business areas. They are further specified on five proficiency levels (e-1 to e-5) which are related to the European Qualifications Framework (EQF) levels 3 – 8. This provides a European basis for internationally efficient personnel planning and development. The European e-Competence Framework is supported by the European Commission and the Council of Ministers. The DOSY Framework in brief DOSY is a competence framework representing organisational/transversal competences described through learning outcomes. It derives its name from the European project “DOSY – Developing Organisational Skills for Young Workers” where it was developed by a group of European organisations representing both the labour market and the education and training system. The DOSY approach stated that organisational competences cannot but be contextualised in the specific work process and job environment. Therefore, an effective action to develop them could not be limited to theoretical training, delivered either through open distance learning or in the traditional classroom in a training centre. Likewise daily work activity as such was not enough to develop those competences. Rather a training moment was to be integrated in the work experience, by establishing a training path based on alternating working activities and moments dedicated to the study of “ad hoc” material. These two components had to support each other, allowing the learner to reflect on the work experience and on the working process in which he/she was daily integrated. With this approach in mind, the DOSY project developed a training model for enterprises, targeting young people with an apprenticeship contract, and tested it in several companies. 
The model revolved around the interplay between daily work experience in the company based on project work (supervised by a mentor within the enterprise) and an individual path where scope was created for reflection on the work experience (and with continuous support offered by a tutor from a training centre).The common backbone of such a training path was a set of six micro-modules (“Self-diagnosis and self learning”; “Positioning inside the enterprise”; “Learning to learn”; “Communication and cooperation”; “Problem solving; Improving your working skills”), representing the soft side of the integrated job activities/context that the learner was experiencing, and against which the development of competences was assessed. This training model permitted the development and assessment of a set of organisational competences as they manifested themselves in the work experience, while helping to “decontextualise” them and develop their transferability to other working/life contexts, to the advantage of the individual. 21 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 3 Collection of Evidence Fondazione Politecnico di Milano Evidence Identification and Collection Procedures Collecting evidence is one of the fundamental steps to be included in any certification process of competences acquired within non-formal and informal learning environments, e.g. on the job. Demonstrating one’s own competences is a necessary starting point. First, some questions must be asked with regard to the meaning of “evidence” and more specifically, the kind of evidence that relates to demonstrating competences. “Evidence” fulfils the need to be as objective and transparent as possible when assessing and certifying competences which are not formally developed under the responsibility of the school system. “Evidence” refers to any objective information showing directly or indirectly competences or parts of them. Candidates have to be able to both preserve and update evidence and to show it to different actors over time. Hence, it has to be transferable and communicable. Examples of this may be e.g. written documents such as reports or emails; multimedia files such as video clips on candidates’ performances; specific products; statements of endorsers etc. Concerning evidence material such as written documents and products, candidates have to be able to clearly indicate their involvement. Their competences vary depending on whether they are creators, developers, supervisors, etc. These examples refer to e-competence but of course they can also be used and applied in other competence sectors. Such evidence has to be documented and provided by individuals specifying if they have developed it by themselves, in a team or by the help of others, etc. Evidence can come from professional contexts within a company but it can also be developed from other contexts, e.g. leisure time. In general, evidence can demonstrate a whole competence or parts of it. Competence is a complex concept which is a combination of hard and soft aspects and is manifested in complex and integrated behaviours, making it a holistic construct. The EQF defines it as follows: “Competence means the proven ability to use knowledge, skills and personal, social and/or methodological abilities, in work or study situations and in professional and personal development”. 
The definition coming from the European e-Competence Framework (e-CF), agreed by several large companies and countries including France, Germany and the UK, states that competence is “a demonstrated ability to apply knowledge, skills and attitudes for achieving observable results”. Consequently, the e-CF speaks about “e-Competences”, plural, not referring to their components (e.g. skills and knowledge) but considering them as the competence set for the ICT sector. A person can be competent in e.g. “testing” but not in developing a business plan. Being able to develop a business plan is not a skill but a full competence, according to the e-CF description. Both can become discrete jobs. To demonstrate this, there are companies only engaged in e.g. testing ICT products, which is their core, main business. In Table 1 (see Annex 1), each e-Competence is related to DOSY skills to make the soft dimension explicit and the identification of related evidence easier . 22 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 3 Collection of Evidence Fondazione Politecnico di Milano Accordingly, evidence can include all these dimensions (i.e. knowledge, skills and attitudes) or only demonstrate one of them, e.g. a piece of knowledge or a skill. Collecting “holistic” evidence requires practical indications about how and where the different competence dimensions emerge. Consequently, some time will have to be spent highlighting these dimensions. On the other hand, in terms of effectiveness of demonstration of competence, collecting “holistic” evidence is better than identifying “punctual” evidence for each of the competence components. In the VALEW model holistic approaches are therefore preferred and evidence showing whole competences is encouraged. All in all, collecting evidence is quite demanding and the following could represent limitations: Candidates have to become aware of their competences, to reconstruct the history of their experiences and identify links between what is produced in their jobs or activities and the competences to be demonstrated Sometimes candidates might not have saved their work and in consequence not be in the position of finding any pieces of evidence Very often evidence includes confidential information and candidates are not allowed to use company data for personal aims, when the certification is not requested by their company In general, accessibility of evidence may not be easy To make the task less difficult, people should be invited to build personal portfolios and keep them updated over time. Moreover, coaches and counsellors should be strongly recommended to support people in collecting evidence, in order to avoid disappointment at Panel stage where a panel of experts will assess the evidence. They should help them make their implicit know-how explicit and more visible. Getting endorsements is also a very good way of demonstrating one’s own competences and virtual communities can become more and more trustworthy sources of information. Portability of Evidence Type of evidence may change according to the competences to be demonstrated. In general, manual skills need different pieces of evidence from white-collar and managerial ones. For manual competences, video clips are recommended since they may show candidates’ manual abilities. ICT professions can be included in the white-collar jobs where competences are more related to conceptual dimensions. 
In these cases, excluding direct on-the-job observation, final products and written documentation can be enough. The criticality may be to demonstrate one’s own authorship. On the other hand, communication skills may also be documented by video clips. Currently, the use of e-portfolios and multimedia curricula is growing. 23 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 3 Collection of Evidence Fondazione Politecnico di Milano Evidence Categories and Examples Evidence can be categorized into three types: 1.Formal documentation, e.g. statements, declarations, co-operation contracts, self-statements, diplomas, certifications 2. Outputs, e.g. products delivered by candidates 3. Actions, e.g. observable behaviours Certificates and diplomas, especially if “vendor specific”, are not always of interest to companies which prefer broader experiences. Moreover, declarations and self-statements where good references are reported are not always reliable (this problem is quite common in Germany). Observable behaviours can be useful for manual competences, e.g. in simulation contexts; otherwise they imply a constant observation at the work place. Consequently, products seem to be the most suitable proofs in the ICT context, and accordingly these VALEW guidelines mainly focus on products. Products can be works in process or finished works, and they can be either tangible or intangible. Specimens, letters, minutes of meetings, reports, and ICT programs and routines may also be pieces of evidence. In general, evidence should be: “Natural”, i.e. products delivered by candidates within their work activities or during their leisure time “Genuine”, i.e. actually delivered by candidates “Sound”, i.e. related to the competence or set of competences to be demonstrated “Updated”, i.e. consistent with ongoing innovations, technologies, sectoral improvements etc “Simple”, i.e. easy to build and inexpensive Evidence has to be complemented by a final examination. This may be done through an “oral” exam or a simulation. Assigning Levels to Evidences Proficiency or learning “level” is a basic concept used within reference frameworks (EQF, European e-Competence Framework, DOSY Framework). From the EQF descriptors, it is possible to distil three dimensions defining EQF levels (see Figure 5 below): Autonomy, ranging between “Responding to instructions” and “Making personal choices”, from acting under supervision to acting autonomously with authority and commitment Context complexity, ranging between “Structured – Predictable” situations and “Unpredictable – Unstructured” situations Action verbs (complexity of behaviour), ranging between “operative/routine abilities” and “decision-making/innovative capacities” 24 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 3 Collection of Evidence Fondazione Politecnico di Milano AUTONOMY BEHAVIOUR CONTEXT COMPLEXITY Figure 5 The three dimensions of proficiency-learning levels (User guidelines for the application of the European e-Competence Framework) A correspondence model was developed to make the European e-Competence Framework levels in line with the EQF levels. It is based on these three dimensions mentioned above (i.e. autonomy, context complexity, action verbs). 
In Figure 6 below a summary is given:

e-Competence Level   EQF Level
5                    8
4                    7
3                    6
2                    4 and 5
1                    3

Figure 6 Summary of the correspondence between EQF and e-CF levels (User guidelines for the application of the European e-Competence Framework)

Accordingly, each identified e-Competence can be shown at different proficiency levels that are consistent with the EQF levels. With respect to this, each piece of evidence demonstrating an e-Competence at a defined proficiency level has to fulfil the level definition referring to the three dimensions above. In Table 1 (see Annex 1), examples of pieces of evidence related to competences and levels are shown. For each piece of evidence, the three dimensions are to be made explicit. The autonomy dimension is related to the role played by the candidate in the evidence development (e.g. "executor", "creator", "supervisor", etc); context complexity is related to the features of the project/process/activity expressed by the evidence (e.g. a large, international, multi-stakeholder project is usually less structured and predictable than a local project); action verbs are related to the content of the evidence, i.e. the object of the evidence can be something "used", "applied", "transformed", "conceived", "decided", etc (see also Chapter 5 of these guidelines, paragraph 5.2, EQF Levelling: General Indications).
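For readers who prefer a compact, machine-readable view, the correspondence in Figure 6 and the three dimensions above can be sketched in a few lines of code. The snippet below is purely illustrative: the level mapping reproduces Figure 6, while the Evidence record and the helper names are hypothetical and do not belong to the VALEW or e-CF instruments.

```python
# Illustrative sketch only: the level mapping reproduces Figure 6; the
# Evidence record and helper names are hypothetical, not part of the VALEW tools.
from dataclasses import dataclass

# e-CF proficiency level -> corresponding EQF level(s), as in Figure 6
ECF_TO_EQF = {5: (8,), 4: (7,), 3: (6,), 2: (4, 5), 1: (3,)}

@dataclass
class Evidence:
    """One piece of evidence, characterised by the three EQF dimensions."""
    description: str
    autonomy: str            # e.g. "executor", "creator", "supervisor"
    context_complexity: str  # from "structured/predictable" to "unstructured/unpredictable"
    action_verbs: list       # e.g. ["used", "applied"] or ["conceived", "decided"]

def eqf_levels_for(ecf_level: int) -> tuple:
    """Return the EQF level(s) that an e-Competence shown at ecf_level maps to."""
    return ECF_TO_EQF[ecf_level]

# Hypothetical example: evidence offered for an e-Competence claimed at e-CF level 2
sample = Evidence(
    description="Test scripts and defect reports produced for a local project",
    autonomy="executor",
    context_complexity="structured/predictable",
    action_verbs=["applied", "used"],
)
print(sample.autonomy, eqf_levels_for(2))  # -> executor (4, 5)
```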
How to Build and Evaluate Evidence

Table 2 (see Annex 2) shows the excerpt of a tool helping candidates to focus on their competences in order to make them explicit. In addition to this document, which can be filled in or recorded as a descriptive speech, evidence has to be collected. Some examples of pieces of evidence are illustrated in Table 1 (see Annex 1). A coach may help candidates' awareness process, moving from tacit to explicit knowledge, skills and competences. In general, to identify suitable metrics, i.e. a computation system for the evaluation indicators used in the assessment phase, bottom-up approaches are preferred. A recommended way to proceed is to collect a range of feedback from assessors evaluating candidates according to common guidelines. Their judgements and scores can be compared with each other and then normalised to build a reference evaluation standard. Consequently, at this stage, common guiding principles are provided instead of measures established a priori. Hereafter some indicators are provided. They are in line with the EQF dimensions and the related e-CF, as in Table 3 (see Annex 3), and can also be used to define specific learning levels. In fact, according to such indicators, each piece of evidence can be related to specific EQF levels, hence identifying specific levels for the competence or competences it intends to demonstrate. Such indicators are in line with the approach that Lombardy Region, in Italy, is adopting to certify competences acquired in non-formal/informal learning environments. Accordingly, Table 4 (see Annex 4), Table 5 (see Annex 5), Table 6 (see Annex 6) and Table 7 (see Annex 7) are addressed to assessors. Table 4 (see Annex 4) and Table 5 (see Annex 5) are used when assessors have to evaluate evidence related to a competence or a set of competences. The indicators listed in the first column of those tables are in line with the EQF and hence can relate competence and evidence to its levels. Table 6 (see Annex 6) and Table 7 (see Annex 7) provide indications for live evaluation, in particular related to an oral examination. Here, however, the focus is on e-competences in the ICT sector; if the assessment were focused on other competences more related to manual activities, it could be organised around a simulation test, and in that case the indicators would have to change. All those tools are in line with the approach to the certification of competences acquired in non-formal/informal learning environments currently being used in Lombardy Region.

4 Recognition, Assessment, and Certification
Foundation European Centre Valuation Prior Learning

The process of competence recognition and certification is quite complex. It goes through several steps, from the application for recognition, to the evidence collection, the competence evaluation and certification. Each step includes specific actions to be carried out for the specific work profiles of candidates, and these have to guarantee observance of the quality assurance procedure. In this chapter the whole recognition and certification process, according to a VPL approach, will be shown. It reflects European guidelines and is fully coherent with procedures put into action in other European countries. For this reason it has become the reference for the VALEW Model.

The Overall Rationale: The total VPL Process in 5 Phases and 10 Steps

The Valuation of Prior Learning (VPL) is the name for a broad view of the implementation of lifelong learning. The target is to change the socio-economic system in a flexible way that adapts itself to the fast-changing needs for competences in the job market. In the context of the present network society, VPL addresses these needs by clarifying the real human potential of competences on the basis of the analysis and recognition of personal competences. VPL offers a strategy for personal development in the context of organisations, governments and private services, and is of crucial importance for individuals to keep pace with the fast competence development in the knowledge society and the increasing speed of competence marketing within the network society. In view of this context, VPL concentrates generally on the needs pertaining to:
- "I": me, the individual: I am able to take control of MY own competence development and career to become or stay employable, no matter whether I am young or old, whatever context I am in, and however I am learning.
- "WE": the communities/organisations I am in (companies, non-profit and volunteer organisations, private life): OUR organisation will support ME in my learning project, individually or with others, and make it possible for ME to use this to follow MY personal mission and to reach MY goal, while contributing to the larger goal of the community I am working/learning in.
- "THEY": the system and its elements (EQF, NVQ, VET, guidelines, tools, advice and other services for individuals, labour organisations, local municipalities and welfare care): to support ME and US from within THEIR existing frameworks, so that I can keep on developing and OUR community can also further develop. This becomes 'the us-level', the level of collective services that is and must be available for all citizens.
VPL in 5 Phases

In each VPL system we recognise the parts of a total VPL procedure, which consists in general of five phases:
1. Engagement and awareness of the value of your own competences
2. Recognition (formal and informal) of your competences, by looking back and reflecting
3. Valuation and validation of your competences, formal, informal and looking forward
4. (Advice concerning the) development of your competences
5. Structurally taking up your competence-based development process, according to a personal plan, perhaps steered through your organisation

Together these five phases form the VPL process. The phases are further divided into 10 steps, grouped as follows:
- Commitment: Step 1 Awareness; Step 2 Starting/targets; Step 3 Preparation & PDP
- Recognition: Step 4 Retrospection; Step 5 Choose standard
- Valuation: Step 6 Valuation; Step 7 Validation; Step 8 Prospection
- Development: Step 9 Working on PDP
- Empowerment: Step 10 Empowerment
On the basis of this 10-step model, each individual can go through those 10 steps.

VPL is a continuous valuing process. It takes place at all levels, from the very informal, such as being appreciated by your colleague or your client, up to the most formal valuation, such as accreditation with a diploma according to national standards. VPL includes stimulating learning, or knowledge development. We call this the broad, or formative, VPL approach. The broad approach is prospective and aims at development. VPL also includes a narrow, summative approach, focusing on an overview of competences and their recognition and validation. The summative approach is also called the retrospective approach of VPL. This chapter of the VALEW guidelines focuses on Steps 6 and 7 in Figure 7 below, the formal recognition of acquired competences.

Figure 7 The accreditation procedure: 1. Informing, 2. Application, 3. Intake, 4. Instruction, 5. Portfolio/proof, 6. Assessment, 7. Assessment report, 8. Accreditation, 9. Certificate/diploma, 10. Evaluation, carried out with the involvement of the coach, the assessor, the administration and the certifying body.

Step 1: Informing
The individual should receive detailed information about:
- the procedure
- the time it will take
- the instruments to be used
- the qualification standard to be used
- the actors involved
- the code of appeal
- the costs
- the requirements for admission (CV etc)
- any manual for use by candidates
- the application form

Step 2: Application
It should be made clear to the applicant where to apply and what the consequences of applying are for both the individual and the VPL/accrediting organisation, such as a time and payment schedule, quality criteria, and the grievance procedure. It is suggested that the total VPL procedure does not take longer than three months and that a payment schedule is agreed upon, either paying for the whole procedure in advance after signing the application, or paying in instalments, e.g. one before entering the portfolio phase, one before entering the assessment phase and the last before the formal accreditation phase. The second payment option could include the option for the individual or the VPL organisation to stop the VPL procedure halfway.

Step 3: Intake
During the intake step the coach will make, together with the individual, a first portfolio, based on readily available information combined with the results of possible competence tests, e.g. a (digital) quick scan.
Step 4: Instruction
The individual will receive instructions from the coach on:
- how to proceed with completing the portfolio
- training for the tests/assessments that will follow, such as a competence-based interview, workplace evaluation, or tests (competence, motivation, personality, theory, capacity, etc)
- how proof will have to be given and what proof would be valid: for each standard the type of evidence can differ, and more than one type of evidence can be used to prove a competence and/or a level of education or work

Step 5: Portfolio/proof
In the portfolio, evidence is collected and structured according to the needs of the qualification demand. The question is whether it is possible to present and provide understandable and valid evidence. This is the most difficult part, because when the evidence is created, it is often not yet known what it will be used for. For example, specific activities as a volunteer can at a later stage be relevant evidence for an HRM department, or for an assessment in an industrial sector, or to get an accreditation in the vocational education system, or just within the volunteer organisation. The way the evidence is presented and stored in the portfolio needs to be open enough to be recognised by all, yet specific enough to get the highest effect in each of the specific situations. Moreover, there are many organisations/companies which still do not work with portfolios and/or competence profiles.

Instruments that can be used:
- (Digital) portfolio
- Documentary research
- CV (Curriculum Vitae)
- 180-degree feedback: the method in which the employee makes a self-assessment and the manager makes an assessment and gives feedback on the self-assessment
- 360-degree feedback: the method in which the individual gathers, in a systematic way, judgements from colleagues, others in the work environment, clients and one's private environment, as well as a self-assessment. These outcomes are compared with the outcomes of the management, preferably according to a pre-defined list of criteria. This way of working gives a more reliable picture of reality than the judgement of the manager alone. It also gives better feedback on, and insight into, bottlenecks in teams and organisations.

Step 6: Assessment
1. The assessment can consist of a combination of different types:
a. Workplace visit/research, company scan
b. Practical assessment/observation
c. Role play (on DVD, practice, simulation)
d. Portfolio assessment/reflection discussion
e. Criterion-based interview
2. The assessment should be done by a certified assessor, preferably two: one from education/training and the other from the workplace.

Step 7: VPL report
The VPL report emanates from the findings of the assessment. The report contains the method used, the material reviewed, the tests made, and all the results. It should also contain a summary and advice for the certifying body.

Step 8: Accreditation
Based on the VPL report the certifying body substantiates the findings, partly or in whole, and decides which awards, certificates or diplomas will be issued.

Step 9: Certificate/diploma
A formal certificate or diploma is issued.
Step 10: Evaluation
The evaluation of the whole process takes place, listing points for improvement, which will be fed back into the quality system.

Types of Award

Awarding may pertain to one or more of five levels:
1. Self-estimation and recognition
2. Social/societal recognition (e.g. DOSY)
3. Company/organisational recognition
4. Industrial sector recognition (e.g. e-competences)
5. National vocational qualifications (NVQs)
At European level there is the EQF. This is not an awarding system but simply a translation device between different European countries' NVQs.

5 Reference to EQF and other existing Qualification/Competence Frameworks (national, regional, sectoral etc)
Scienter; the University of Edinburgh

The purpose of this chapter is to show a shared methodology for assigning EQF levels to existing qualifications (or competences)6, as well as to other certificates or forms of recognition of learning outcomes, however acquired, so as to form a bridge – in the most objective way possible – between the informal learning experience and shared codes systematising learning achievement (and its level).

6 The approach proposed originates from the work developed by another EQF project supported by the European Commission (€QUALIFISE), active in the financial service sector, in which formal international qualifications exist. An effort has been made here to include consideration of less formal (e.g. company-based) forms of recognition of learning outcomes, specifically those acquired at the workplace.

The EQF-based Approach

In the specific framework of the VALEW Model – which is aimed at making visible, recognising, validating and certifying competences informally acquired at work – the importance of framing and translating the informal learning experience into a structured and shared system of learning classification is paramount. It addresses the core issue of how to connect experience acquired at work to formal description and recognition of learning achievement, so as to open up channels to formal education and inter- and intra-sector labour mobility, as well as to self-development projects. The EQF can help structure indicators to position competences onto the EQF levels. Learning outcomes, meant as statements expressing what a learner knows, understands and is able to do (see Chapter 1 of these guidelines and the EQF definition), are the first building blocks of such a bridge. Through the abstractness of language, they permit the expression of the learning result of experience, providing the basic link to acknowledged codes describing learning/education achievements and professional standards (formal qualifications, national qualification frameworks, job profiles, etc). In this schema, the EQF offers itself as a tool with high potential to "systematise" learning outcomes and give them a "relative value" – a position – within a shared learning taxonomy potentially able to encompass all learning contexts as well as the different languages they use. In fact, the EQF was designed to be a translation device to facilitate the comparison of qualifications and qualification levels across Europe, to assist geographical and labour market mobility as well as lifelong learning.
32 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 5 Reference to EQF and other existing Qualification/ Competence Frameworks (national, regional, sectoral etc) Scienter; the University of Edinburgh The ‘level’ of a qualification (or a competence) is an indication of the demands made on the learner, the complexity and depth of achievement and the learner’s autonomy in demonstrating that achievement. The EQF level descriptors provide a general, shared understanding of learning and achievement at each of the eight levels on the Framework and are designed to enable their use across a wide range of learning contexts. Before qualifications (or competences) can be levelled to the EQF, it is usual for them to be expressed in terms of learning outcomes and assessment criteria. As illustrated above, learning outcomes describe what a learner will know, understand or be able to do. Assessment criteria are the set of standards a learner is expected to meet to demonstrate that each learning outcome has been achieved. Each qualification, at whatever level, can be further described in terms of the type of learning achievement. The EQF distinguishes three main learning categories: Knowledge (knowledge and understanding) Skills (application and action) Competence (autonomy and accountability) All qualifications are likely to reflect aspects of these categories, although some will focus on one or two of them rather than all three. Given this architecture, the EQF works as an overarching framework within which all types of qualifications and certificates, and the related qualification systems, might be compared, by means of their expression in learning outcomes and thanks to the ongoing institutional effort of referencing national qualifications systems to the eight EQF levels7. In this context, specific competences and sets of competences acquired informally at work and expressed as learning outcomes are often easily placed (at least partially) within existing qualifications and national/sectoral qualification frameworks where available, and through them, assigned an EQF level. In the absence of such frameworks, or when dealing with more unstructured learning experiences and related learning outcomes, the EQF can residually work as an adequate shared reference framework to attach a level to learning achievements. In this case, however, indicators are needed to support an objective assessment of learning outcomes against the eight levels of the EQF, on the basis of evidences provided. The guidance in the following sections is to help qualification evaluators assign an EQF level to an individual qualification – meant as a set of learning outcomes including different combinations of competences, knowledge and skills, however acquired – including any component units, and to allocate a value to the qualification’s weighting towards each of 7 We refer here to the ongoing effort undertaken by the EU Member States, at different paces, to level their national qualifications systems to the EQF by 2013. Meanwhile, the EQF infrastructure offers to different sectoral, international competence frameworks (such as the European e-Competence framework) the possibility to build on its descriptors and levels to obtain a full transferability and European acknowledgement. 
33 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 5 Reference to EQF and other existing Qualification/ Competence Frameworks (national, regional, sectoral etc) Scienter; the University of Edinburgh the three learning categories. Specifically, the VALEW Model proposes the appointment of a panel of experts who will be in charge of the levelling exercise by reflecting individually on the qualification and then together in a meeting which will lead to a decision on the level. EQF Levelling: General Indications In order to ensure the highest degree of objectivity to such a levelling process, especially when informal learning is concerned, some general indications are provided, starting from the appraisal of available reference frameworks (and the related level-assessment criteria), but offering also some parameters and dimensions to consider in order to evaluate more “scattered” learning outcomes and competences against the EQF levels. Given the multi-levelled picture provided above, as well as the heterogeneous setting characterising qualification frameworks and prior-learning recognition across Europe, the VALEW Model proposes three different idealtypic hypotheses referring to a continuum from a well-known set of competences finding full correspondence within existing qualifications, to a highly unstructured set of learning outcomes emerging from working experiences and requiring adequate indicators able to support level-assessment. The first two hypotheses correspond to situations in which the competences acquired informally, and expressed in learning outcomes, match existing qualifications, defined job profiles and fully-developed reference frameworks, referring to both professional and/or institutional standards, either national or sectoral, and already levelled to the EQF. In this case, it is possible to use their descriptors and indicators to assign a level to observed and assessed learning outcomes. Specifically: 1.In the case in which the learning outcomes refer to competences clearly corresponding to existing qualifications within national qualifications systems, the latter provide for indicators and descriptors which help the assignment of levels both within these systems and, indirectly, to the EQF. 2.In the case in which the set of competences under evaluation is more unstructured and does not find correspondence in any existing national qualifications system, it is possible to use the descriptors, indicators and levelling methodology provided by already-developed sectoral, European and international frameworks, e.g. the European e-Competence Framework. 3.The third hypothesis refers instead to more unstructured learning outcomes as emerging from the learning experience that takes place at work. In this case – and as described in Chapter 3 of these guidelines (Collection of Evidence) – some reliable and shared indicators are needed to objectively assess learning against expected standards and attach it a level of proficiency on the basis of evidences provided. (Please see Table 5 (Annex 5) and Table 7 (Annex 7) to these guidelines.) 
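Read together, the three hypotheses describe a simple decision sequence for choosing which reference to level against. The sketch below only paraphrases that continuum under stated assumptions: the function name, its boolean parameters and the returned route labels are hypothetical and do not belong to any VALEW or EQF instrument.

```python
# Illustrative sketch of the three-route continuum described above.
# Names and structure are hypothetical, not part of the VALEW tools.

def choose_levelling_route(matches_national_qualification: bool,
                           matches_sectoral_framework: bool) -> str:
    """Pick the reference used to assign an EQF level to a set of learning outcomes."""
    if matches_national_qualification:
        # Hypothesis 1: use the descriptors of the national qualifications system,
        # which is (or will be) referenced to the EQF.
        return "national qualification descriptors"
    if matches_sectoral_framework:
        # Hypothesis 2: use an already-levelled sectoral/European framework,
        # e.g. the European e-Competence Framework.
        return "sectoral/European framework descriptors (e.g. e-CF)"
    # Hypothesis 3: assess the evidence directly against EQF-based indicators
    # (autonomy, context complexity, behaviour), as in Tables 5 and 7.
    return "EQF-based indicators applied to the evidence"

print(choose_levelling_route(False, True))
```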
34 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 5 Reference to EQF and other existing Qualification/ Competence Frameworks (national, regional, sectoral etc) Scienter; the University of Edinburgh In the absence of immediate reference to existing qualifications and the related descriptors, the EQF descriptors offer a set of abstract parameters, expressed in terms of ranges, referring to autonomy level (from none to full) and to context complexity (from highly structured to nonstructured working contexts), and for each of them specific indicators can be identified. (E.g. please see Table 5 (Annex 5) and Table 7 (Annex 7) to these guidelines.) In addition to this process of “operationalisation” of the EQF dimensions, reliable indicators can also be drawn from the common patrimony of prior-learning recognition, starting for instance from the lesson and methodology offered by the Bilan de Compétences. This overall recommendation on how evaluators should behave in order to assess learning outcomes and competence levels against the EQF levels fulfils a triple goal: 1.It allows the highest possible objectivity in assessing learning achievement levels, thereby guaranteeing impartiality, transparency and above all equality of the process, by means of referencing to shared and acknowledged indicators supporting judgement, but likely to fit the specific features of the context. 2.It provides for a cost-effective methodology which, when possible, relies on existing instruments, thereby reducing the risk of arbitrariness of judgement as well as reducing the effort required on the part of the single evaluation panel. 3.The residual employment of the EQF parameters and other indicators avoids bypassing the national and sectoral levels, by exploring in the first instance the availability and suitability of existing national qualifications systems and/or transnational and sectoral frameworks, thereby respecting national and context specificities while adopting their own EQF-levelling system to guarantee European comparability. EQF Levelling: Application in the VALEW Model As regards the levelling of qualifications, it may be found that the application of this VALEW guidance may lead to related and/or unexpected outcomes, e.g.: Checking the internal coherence of a qualification. During the levelling process, it may be found that the learning outcomes and the assessment materials do not match each other – does the assessment provide appropriate evidence that the candidate has fully achieved the learning outcomes and does the qualification deliver what it promises? Testing the adequacy of the assessment tools, the coverage of the assessment criteria and the score necessary for obtaining the qualification Checking external relevance – does the qualification continue to meet market requirements or should it be reviewed? Outcomes like these should also be recorded on the meeting documentation (Form A, Annex 8 to these guidelines) so that, where appropriate, action may be taken by the organisation concerned. 35 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 5 Reference to EQF and other existing Qualification/ Competence Frameworks (national, regional, sectoral etc) Scienter; the University of Edinburgh Prior to the meeting: 1.The process to level each qualification to the EQF should take place at a formal meeting and the decisions and the reasons for those decisions should be formally documented on the templates provided (Form A and Form B). 
2. The Panel chosen to assign the appropriate EQF level to each qualification should consist of a reasonable number of suitably competent people – no fewer than three. The Panel's expertise should encompass the following:
a. the content of the qualification under review
b. related sector occupation/s
c. framework-levelling experience
d. training delivery, e.g. teachers or lecturers in the subject
e. relevant assessment experience
f. non-related sectors and/or qualifications
3. Panel members must be unbiased and capable of acting objectively. They must be jointly prepared to agree and document the final outcome and the reasons for the decisions taken. The Chair of the Panel should be independent from the qualification's owner to achieve maximum objectivity.
4. It is essential that individual Panel members prepare, prior to the formal meeting, in order to familiarise themselves with:
a. all the appropriate qualifications materials, in particular the learning outcomes and any available assessment materials
b. the EQF level descriptors
5. This preparation will enable them to make initial decisions (prior to the Panel meeting) on "best fits" for the units or qualifications. Experience shows that the outputs of the Panel meeting will be more effective if this prior work has taken place. Panel members should be informed of the need to complete preparatory work before the meeting and asked to set aside sufficient time.
6. The following materials should be provided to Panel members, with sufficient time before the meeting for individual consideration and at the meeting for group discussion:
a. guidance for levelling procedures
b. blank Form A and multiple copies of blank Form B
c. the EQF level descriptors
d. syllabuses for the qualification or for all its constituent units under review (covering the learning outcomes and assessment criteria for each)8
e. assessment materials9
7. Where a qualification is made up of several units (whether mandatory or optional), the whole group of units should be reviewed during the same meeting. Sufficient time should be set aside for the meeting, according to the number of units involved, to enable considered judgements. It is important to stress that qualifications can be levelled by unit or by whole qualification.
Notes: (8) In case the certification is provided in several countries or the panel is international, the use of the English version is recommended. (9) As above.
8. Each member of the Panel should record their individual findings (on Form B). If possible, Panel members should provisionally complete Form B before the meeting, as long as they have been able to give consideration to all the available material. In this case the meeting could begin with a discussion about the individual findings of the Panel members. After discussion, the Panel's final decision will be recorded on Form A together with the reasoning behind the overall decision reached. Form A and the reasoning will serve as an audit trail on how and why each qualification (and its units) has been allocated a particular level within the EQF.

During the meeting:
1. Assigning a level is not an exact science and the aim of the Panel should be to find the best fit across the level descriptors.
It should be noted that the levelling exercise depends to a certain extent on subjective judgement, and it is recommended that the Panel refer to any existing examples from the sector involved, to help in reaching an objective decision.
2. It is suggested that the Panel begin with individual analysis of the documentation (if this work has not been completed before the actual gathering of the Panel). The collective analysis should start with a discussion of the approximate level at which they believe the unit or qualification being reviewed sits. They should then carefully review the EQF level descriptors above, below and at this chosen level to establish its accuracy. The aim will be to decide on the best fit for each learning category for each unit or qualification, i.e. for (a) Knowledge, (b) Skills and (c) Competence.
3. The Panel should review all the learning outcomes and assessment criteria for each unit. An overall level for each type of learning category for each unit should be recorded by each Panel member and then discussed.
4. In the event that different levels for different learning categories are recorded, the 'weighting' system based on the values recorded for the weighting of each learning category (described below) can be used to determine the overall level for each unit or qualification.
5. It is possible to denote the scope of the qualification by incorporating the letters K, KS or KSC in the title (to indicate whether it is based on knowledge only, a combination of knowledge and skills, or if it is a full competence qualification), as long as this attribution is explained. It may also be possible to use colours to illustrate the qualification's scope, i.e.:
1. Knowledge only – Level 4 (K)
2. Knowledge and Skills – Level 4 (KS)
3. Knowledge, Skills and Competence – Level 4 (KSC)
It is not recommended that any coding be mandatory, but it could be used where it helps to distinguish between qualification types.
6. The Panel should weight each qualification in terms of approximate percentages, according to the extent to which its learning is based on (a) 'Knowledge'/knowledge and understanding, (b) 'Skills'/application and action, and (c) 'Competence'/autonomy and accountability. All qualifications are likely to reflect aspects of all three learning categories, although some will focus on one or two rather than all three. For example, if in assessing a qualification the Panel considers that it is mostly made up of pure knowledge, with some elements of skill but little or no element of 'competence' to be assessed, they might record the following result:

Learning Category   Weighting   Level
Knowledge           90%         4
Skills              10%         3
Competence          0%          n/a
Overall Level = 4

Equally, a qualification based on competence assessment might have the following result:

Learning Category   Weighting   Level
Knowledge           10%         3
Skills              20%         4
Competence          70%         4
Overall Level = 4

(A short illustrative sketch of this weighting arithmetic is given after this list.)
7. When each unit has been reviewed and allocated a level, the overall level of the qualification can be determined. If, after discussion, a joint decision cannot be agreed, the Panel may decide to record the majority decision. A summary of any major differences of opinion should be recorded.
8. Similarly, when each unit has been reviewed and a percentage weighting for each learning category recorded, the overall weighting for each learning category can be determined. If, after discussion, a joint decision cannot be agreed, the Panel may decide to record the majority decision. A summary of any major differences of opinion should be recorded.
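The guidelines do not prescribe a formula for combining the category weightings and levels into an overall level. One reading that reproduces both worked examples above is a weighting-weighted average of the category levels, rounded to the nearest whole level; the sketch below implements only that interpretation, and its function and parameter names are hypothetical.

```python
# Illustrative only: the guidelines do not fix the arithmetic for combining
# weightings and levels; this weighted-average reading reproduces both
# worked examples above. Names are hypothetical.
from __future__ import annotations

def overall_level(categories: dict) -> int:
    """categories maps a learning category to (weighting as a fraction, EQF level or None)."""
    weighted = [(w, lvl) for w, lvl in categories.values() if lvl is not None and w > 0]
    total_weight = sum(w for w, _ in weighted)
    average = sum(w * lvl for w, lvl in weighted) / total_weight
    return round(average)

# First worked example: mostly knowledge, some skill, no competence element
print(overall_level({"Knowledge": (0.9, 4), "Skills": (0.1, 3), "Competence": (0.0, None)}))  # 4
# Second worked example: competence-based qualification
print(overall_level({"Knowledge": (0.1, 3), "Skills": (0.2, 4), "Competence": (0.7, 4)}))     # 4
```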
Further Considerations

Other considerations and comparisons with previously levelled qualifications include:
1. As illustrated in the introduction above, the Panel should preliminarily consider where the learning outcomes position themselves on the above-mentioned continuum, from full correspondence in existing national qualifications frameworks to non-structured learning outcomes requiring the deployment of a set of indicators able to objectively support level-assignment against the EQF descriptors. In the latter case, the EQF dimensions described in Chapter 3 of these guidelines – including context complexity, level of autonomy and behaviour and the related EQF levels – can be investigated through specific indicators able to assess whether the competences informally acquired imply:
- full to no autonomy
- a non-structured to highly structured context
- a degree of risk management (from high to very low)
2. It may be useful to consider the typical candidate taking the qualification, including their experience and job responsibilities, whilst acknowledging, for example, that not all qualifications for senior personnel are necessarily at a high level.
3. It has been found that the translated/national version of the EQF level descriptors does not always properly reflect the English (master) version. It may be very useful to translate the guidelines given here into the national language, and to use translated forms for the levelling process, so that any language difficulties can be addressed for the country concerned. Problems with mis-translated level descriptors should be brought to the attention of the relevant national authorities.
4. It may be useful to consider whether the qualification has entry requirements or is a requirement for progression, as this may also provide an indication of the level. Again, it should be noted that progression qualifications are not necessarily at a higher level.
5. The Panel should consider the assessment materials or evidence used, including any assessment strategy and any learning materials available.
6. Sometimes the qualification might be a requirement for a certain (regulated) job role, in which case this could give an indication of the expected level of 'competence' required, but the qualification requirements as a whole should be taken into account in the levelling decision.
7. It should be noted that, where a qualification is not made up of units, it can be broken down into smaller logical parts for the purposes of the levelling exercise to help the levelling decision process.
8. The definition of competence in the EQF level descriptors is based on autonomy and accountability.
It should be noted that, in levelling decisions where autonomy and accountability are not relevant, it may be helpful to use the wider EQF definition of “competence”, meaning: ‘the proven ability to use knowledge, skills and personal, social and/or methodological abilities in work or study situations and in professional development’ ** The European Qualifications Framework for Lifelong Learning (EQF) Luxembourg: Office for Official Publications of the European Communities 2008 Recording decisions: 1. Form A (Annex 8 to these guidelines) is designed to record the full Panel’s decisions on the qualification under review (on the first page) and the final decisions on the allocation of level and learning category weighting value (on the second page). Each field can be extended as necessary to record as much information as is felt necessary. 2. Form B (Annex 9 to these guidelines) is designed to record individual Panel members’ judgements on qualifications or individual units, prior to discussion with the rest of the group. This is the working form, which can be copied as many times as necessary prior to the meeting. 3. The first field of each form, Panel Reference, is to record the number or reference of the Panel meeting, partly so that related pages can be kept together easily (where there may be a series of Panel meetings to review different qualifications). 4. Additional space for comments on the front of the form should be used for any other relevant information about the Panel or the qualification. For example, is the qualification a regulatory requirement or a ‘licence to practice’? A representative of the regulatory body in this case should be part of the Panel. 5. It will be important that the Panel document their reasons for assigning a particular level to a qualification (and its units) in a way that is transparent to an independent auditor. It is suggested that one way to justify a decision might be to refer to the descriptors in the level above and below and explain why they are not suitable. 6. After the Panel have reached a final decision and have agreed on the comments and values logged on the master qualification template form (Form A), all members should add their signatures against their names on the front of the form. 39 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 6 Ensure Recognition IG Metall; ARIFL; Scienter España; Fondazione Politecnico di Milano Competence recognition and certification as a concept and set of practices must first be recognised by the labour market, i.e. shared and agreed to by companies and all the organisations potentially interested in competence certifications: VET institutions, public administrations, etc. The more such bodies can recognise competence certification systems, the more the value of these systems (their acknowledgement and reputation) will increase and “virtuous circles” grow. In this chapter, this issue will be discussed taking into account the stakeholders, how to attract them, and the enabling factors for success. European studies and research on validation of workplace learning show that from a practical standpoint, a wide set of technical instruments and methodologies is now available to fulfil the objective of making informal learning observable, measurable and classifiable, making it possible to have reliable devices for validating informal learning at work. 
The different chapters of these guidelines provide indications on which steps to undertake and which methods to adopt in order to create such a system and equip it with the necessary transparency and objectivity to fulfil a quality assurance aspiration. It is worth acknowledging, however, that although the design of recognition systems as a technical process can rely on a wide set of methodologies developed across Europe, the recognition of learning outcomes at the workplace is not yet a generalised phenomenon. The explanation for this needs to be sought within the other milestone of a validation system: the consensus required at the political level from the actors of the system as a whole. In that respect it seems that the overall usefulness and relevance of recognition systems are not sufficiently perceived or actively acknowledged by key social constituents. The following paragraphs analyse the enhancing and inhibiting factors which key players face when validation of learning at work is at stake, and outline which key conditions allow for a systemic endorsement of a validation system and its institutionalisation.

Which Players Can Recognise and therefore Add Value to a Certification?

In an ideal case, a marketable competence certification should be recognised by a variety of players and from a variety of perspectives. Together these perspectives should cover both the formal and the informal dimension of recognition, which are equally important for the successful implementation of a competence certification on the labour market and over time.

To cover the formal dimension, the following players come into consideration:
1. public authorities (e.g. ministries, national qualifications authorities, regional governments etc): these players can ensure recognition of a certification from the perspective of the formal and educational systems and sector approaches
2. sectoral and professional bodies with recognised certification functions
3. education and training providers: mutual recognition of certifications and diplomas by different educational and training players helps to increase mobility, transparency and interoperability in professional education and training, and thus acceptance and accessibility of specific certifications

To cover the informal dimension, the direct players within the labour market are:
- employers, who are the indirect beneficiaries of certifications. To ensure the value of certifications on the labour market, it is crucial that they are well known and accepted by employers. This acceptance can be achieved by formal support (see above); nevertheless the informal dimension of recognition by employers is often driven by its own dynamic.
- direct end-users, who are the certification candidates and holders. They have to feel motivated to acquire the certification by seeing a real added value for their personal and/or career development. Such added value can only be transmitted if strong support by the other players is provided.
- labour unions, which are the legitimate political entity for dialogue and cooperation between labour, employers and the state. They are a building block in the creation of a system of validation, which requires the commitment of all social partners.
In that respect, the viability and success of informal-learning validation at work is conditional upon an overall climate of co-operation between the social forces, and specifically labour and capital. High quality and market value of the certification in terms of content, method, etc is a prerequisite to facilitate recognition among all the above-mentioned parties. How such a level of quality can be ensured is covered by previous chapters and is therefore here taken for granted. Attractiveness of and Resistance to Validation of Learning Outcomes at Work from the Various Stakeholders’ Perspectives The actors listed above have different interests and are constrained as institutions between change and conservation, according to an incentive-disincentive dynamic. According to this, validation of informal learning and specifically of work-based learning is perceived as both attractive and problematic or even threatening for different reasons. Public authorities might be willing to implement validation systems for workers through an active welfare policy which feeds into lifelong-learning realisation. In terms of public spending, recognising competences acquired through non-formal paths would moreover allow for cutting off the direct costs of continuing vocational training and social subsidies for unemployment. Building a sound infrastructure able to back the validation system would require however a significant effort, both organisational and economic, of a systemic breadth. It would entail a certain degree of transformation of the public institution itself to comply 41 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 6 Ensure Recognition IG Metall; ARIFL; Scienter España; Fondazione Politecnico di Milano with the new pattern of services provided to society. At a higher and more political level, governments and public authorities need to account to, and maintain a consensus with, different actors and ensure a balance of interest among legitimate social constituents. They might be therefore in a sensitive position with respect to some expected resistance from formal education systems and their providers, who are happy to maintain a certain monopoly on education provision and learning recognition. Education and training providers might to a certain extent benefit from systems of validation of informal learning (including workplace learning) able to enhance mobility among and towards learning paths, providing them with new “customers” and designing an expanding role for education and lifelong learning in society. However, widening the borders of learning provision within the jurisdictions of its traditional agents might be perceived as a threat, while concerns around the quality of non-traditional education create resistance to what might be seen as lower access requirements to formal paths. At a micro level, while the commitment of employers and employers’ associations is pivotal to turn work-based learning validation into practice, some structural factors downplay their interest in supporting validation when not creating explicit resistance. Workplace training seems to some extent irreconcilable with the employability/lifelong-learning development vision embedded in the validation discourse. While the latter requires a focus on transversal skills and the transferability/portability of competences across segments of the labour market, enterprises tend to invest in developing skills for the immediate usage of the specific internal productive process. 
In general, employers are not always conscious of the benefits of continuous training, including the recognition of competences at work, due to high turnover rates, which scatter investment in HR development, in the perception of managers. More specifically, companies are highly concerned on one hand with the loss of their human capital, and on the other with the burden of incoming contractual obligations. Many enterprises would be highly interested, however, in using attuned validation practices serving their needs, as well as those of their employees, with a variety of purposes. A company might gain in external visibility and quality assurance by presenting itself in a commonly accepted language to customers and partners, while better managing its own human resources. Companies are particularly interested in certifications of informal learning in the workplace when it comes to recruitment, seconding and outsourcing, or when potential clients ask for secondees to be certified, especially in the ICT sector (not necessarily with vendor-specific certifications). Moreover companies can be interested in such certifications to recognise and certify competences acquired during apprenticeships (beyond specific training measures). In general large companies are at present not very interested in such certifications for their employees because on one hand they are afraid of possible claims from certified employees (e.g. career or salary improvements), and on the other hand they often already have in-house assessment tools and procedures, customized for their needs. By contrast, SMEs have not usually got such internal instruments and could be supported by external certification devices. Unfortunately, SMEs are not usually aware of the importance of competence recognition and development. In this sense, it would be important to develop communication campaigns and learning measures to enhance their culture and their willingness to pay attention to such opportunities. Another issue in this context is related to costs, i.e. who pays. In fact, companies (both large ones and SMEs) are not very keen to support financially such external certifications. Moreover some of them state that certifications should be based on reputation; that is, they 42 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 6 Ensure Recognition IG Metall; ARIFL; Scienter España; Fondazione Politecnico di Milano should be issued by acknowledged bodies and based on quality assurance in their processes and procedures. The network of stakeholders believing in and supporting the validation of informal learning at work is of great importance to raise perceptions of reliability. The enterprise system as a whole would benefit from an active policy device contributing to making the system more flexible, thereby favouring offer-demand matching in the labour market. In this sense, the existence of national or sectoral standards as well as competencebased national qualification frameworks provides scope for validation of informal learning by offering already-developed tools. Such tools allow companies to cut costs of standards development while also helping in the identification of skill shortages and future competence needs in the economic system, so as to manage, in a competitiveness perspective, both the contingent and future HR and development strategies of companies. The existence of such a system is fundamentally attractive for the individual worker. 
In terms of transferability, the possibility to “speak a common language” allows for validation to produce self-standing outputs which can be utilised outside the company in an employability perspective, as well as in a lifelong-learning project in education and training paths. Validation mechanisms offer individuals chances for building a flexible competence capital to be usable within their workplace as well as portable across their lifelong learning and professional experience. Procedures need however to be reliable, transparent and accountable, since the individual must be able to trust the company with regard to his/her evaluation and the consequence it could bring to the individual’s position in the organisational chart. In that respect not all workers are the same, as different professions and positions in the labour market also entail different interests and incentives to undergo a validation process. Professional and sectoral bodies, representing specific categories of qualified workers, have a strong interest in certification systems able to contribute to professionalisation and career paths within the specific sectors, as well as to internal and external competitiveness. As certifying bodies, they might moreover be interested in producing their own certification label which regulates the profession and is a guarantee of quality in the labour (and sell/buy) market. This in turn raises questions with regard to the extent to which a one-size-fits-all public policy on informal learning recognition (i.e. a national validation system linked to an NQF) might be perceived as inappropriate and overbearing for these sectoral and professional bodies that traditionally lean towards self regulation. From the point of view of the candidates for certification, the position of the labour union is a sensitive one. In an evolutionary scenario of changing industrial relations, the labour unions hold a strong interest in maintaining and expanding their areas of jurisdiction so as to increase their manoeuvring scope and raise their political importance as the sole legitimate workers’ representative in the political arena. As a new area for bargaining between employers and labour (and the state), the validation and recognition of informal learning at work represents an important playing field where employees might be able to gain in rights, opportunities, improved conditions and power. It is in particular in terms of wage policy that this advantage is perceived, though the overall implications of an institutionalisation of work-based learning are wider and include both opportunities and threats to the labour union position, i.e. an increasing shift from collective to individual bargaining. 43 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 6 Ensure Recognition IG Metall; ARIFL; Scienter España; Fondazione Politecnico di Milano Key Factors of Success To achieve a broad recognition of a competence certification, early involvement of relevant players in the certification establishment and recognition process is crucial. As early as possible, relevant stakeholders should be identified. For coherent market implementation on national/ EU levels, the highest possible level of certification activity, with good contact to the user base (employers and professionals), should be ensured. Who should be involved concretely would vary according to the focus of the certification and should be analysed carefully in each specific case. 
Depending on the certification focus, relevant players could be e.g. chambers of commerce, VET or higher education networks, ministries, employers plus professional associations, and trade unions. A successful example from Lombardy Region counted on working together with professional associations in both the definition of competences and the assessment of VET final exams. Once the competence certification is available within the labour market, a combined effort by all players involved should be made in order to reach as soon as possible a critical mass of certifications awarded in a reasonable time and a provable recognition and acceptance of the certification by employers. To achieve this aim, establishing right from the beginning a well-functioning and thoroughly visible service infrastructure for assessment, certification and accompanying supporting/consulting processes is a necessity. To achieve a provable recognition within the labour market by employers and certification candidates, rewarding approaches for certified competences can be fostered from a formal perspective. For example, in a competence certification system that was developed by Lombardy Region in 2007, the number of formative credits that are to be obtained for a course could be “discounted” for people who have certified competences in related subjects. To ensure recognition and flexibility, competence certifications could be included in ECTS/ECVET or other translation devices which ensure connections between training and certification paths. Moreover, to ensure recognition of a certification over time, the establishment of a re-certification process, e.g. every three years, might be considered. Finally, a tailored communication strategy should be developed and implemented in parallel, addressing the variety of players and perspectives involved, in order to support the positive image-building of the certification. To ensure straightforward communication and an easily accessible service infrastructure, a responsible body should be identified which can be contacted by all. A well-suited communication strategy would include: certification marketing towards end-users (easy comprehensiveness of process and benefits) certification marketing towards indirect beneficiaries (quality of certification outcomes in work environment and benefits) a lead by a responsible body, supported by other players involved The overall process could be supported in practical terms by promoting a standardized tool for proving personal competences. 44 | 60 VALEW Model Guidelines – How to validate learning at work July 2010 6 Ensure Recognition IG Metall; ARIFL; Scienter España; Fondazione Politecnico di Milano Readiness Index and Proposal to Check Progress against Generalised Validation and Recognition of Learning Achievement in the Workplace A careful analysis, taking into account the inhibiting and enhancing factors listed above – as well as both the existing conditions and the specific actions needed to tackle resistance and commit all relevant actors – provides an insight into the readiness of the system to institutionalise (and formally reward) the recognition of work-based learning. The necessity of involving “opposite” stakeholders so as to build the system from the bottom up, with consensus and shared effort as a basis for success, requires taking into account the positions of each so as to appreciate fully the viability of a validation system in given conditions. 
To this aim, a readiness index has been developed by the VALEW project to collect, rank and compare the different stakeholders’ perspectives/perceptions on the following dimensions:
1. need perceived (for a validation system)
2. awareness of possible solutions
3. willingness to commit
4. willingness to “pay”
5. existing professional capacity in the system
The index allows the degree of system readiness (at national or a more specific level) to be rated on each dimension (from 1 – very low, to 5 – very high), so as to calculate the overall readiness of the system with regard to the recognition of informal learning at work, taking into account, as backdrop information, the actual use, the legislative/regulation conditions and the availability of a national qualifications framework. Such an index (and above all the process of compiling and interpreting it) could also be used as a starting point for discussion among the different stakeholders in a specific system, and as a working tool to undertake the design and implementation of a validation system: to collect different views and needs, understand problems and opportunities, and investigate the viability of the system.

Readiness Index
The index is compiled separately for each group of respondents – Potential Users (Individuals), Employers, Stakeholders and Policy-Makers – with each of the following items rated from very low to very high, together with a total per group and an overall total:
- Need perceived
- Awareness of possible solutions
- Willingness to commit
- Willingness to pay
- Existence of professional capacity in the system (as perceived)
- Actual use (as perceived)
- Legislative/Regulation conditions
- Qualification framework in place
- Total Readiness index

Moving Forward

A first piloting of the readiness index in the countries involved in the VALEW project has provided some interesting, though not fully representative, insights into the state of the art of European systems with regard to the validation of work-based learning. What emerges is that even in the systems most advanced in terms of validation, things are moving slowly and obstacles remain on the way to full acceptance of informal learning recognition. It is nevertheless true that widespread interest exists today among stakeholders, while more general reforms in the policy panorama – including education and training, welfare and employment policy – are progressively moving towards the creation of approaches, mindsets and supporting infrastructures that allow for more dialogue between learning systems and the labour market. In this scenario the development of the European Qualifications Framework, although not a panacea, is an important step in the passage from segmented education systems based on learning inputs towards lifelong and life-wide learning systems that use competences as their currency and learning outcomes as their shared language.

7 Annexes

Chapter 3: Collection of Evidence
Annex 1: Table 1 “e-Competences, DOSY skills, levels and pieces of evidence”
This annex relates to one of the tools developed during the VALEW project.
Annex 2: Table 2 “Making competence explicit”
Annex 3: Table 3 “Correspondence model EQF, e-CF levels (from CWA 2008 The European e-Competence Framework)”
Annex 4: Table 4 “Evidence Assessment Grid”
Annex 5: Table 5 “EQF Indicators related to the Evidence Assessment Grid”
Annex 6: Table 6 “Oral Examination Grids”
Annex 7: Table 7 “EQF Indicators related to the Oral Examination Grids”
Annex 2 and Annexes 4 to 7 relate to tools currently used in the Lombardy Region’s approach to the certification of competences acquired in non-formal/informal learning environments.

Chapter 5: Reference to EQF and other existing Qualification/Competence Frameworks (national, regional, sectoral etc)
Annex 8: FORM A “Master template for allocating EQF level and learning category weighting to OVERALL QUALIFICATION”
Annex 9: FORM B “Working Template for allocating EQF level and learning category weighting to INDIVIDUAL UNITS”

Annex 1: Table 1 “e-Competences, DOSY skills, levels and pieces of evidence”

The table maps each selected e-Competence (taken from the European e-Competence Framework) to an EQF level, an e-Competence proficiency level, the corresponding DOSY key-skill levels (C – Communication; WWO – Working With Others; IOLP – Improving Own Learning and Performance), example pieces of evidence and their typology and means of collection, the candidate’s role in drafting the evidence, the complexity of the projects/activities described in the evidence, and the behaviours/action verbs gathered from the evidence.

A.3. Business Plan Development (EQF level 7; e-Competence level 4)
Addresses the design and structure of a business or product plan through the identification of alternative approaches; includes return on investment propositions. Presents cost–benefit analysis and reasoned arguments in support of the selected strategy. This activity may include the development of enterprise-wide information architectures and processes to ensure strategic application of technology for business benefit. Ensures compliance of business and technology strategies. Communicates and sells the business plan to relevant stakeholders and addresses political, financial and organisational interests, including SWOT analysis. For product plans this will incorporate the marketing and sales strategy and potential use of Value Added Resellers (VARs).
e-Competence level 4: Provides leadership for the creation of an information systems strategy which meets the requirements of the business.
DOSY level: C.L3 – create opportunities for others to contribute to group discussions about complex subjects; organise information coherently, selecting a form and style of writing appropriate to complex subject matter; read and summarise information from extended documents; make a presentation using a range of techniques to engage the audience. WWO.L4 – develop a strategy for using the relevant key skill over an extended period of time, and plan how you will do this; monitor progress, critically reflect on your performance in using the relevant skill, and adapt your strategy, as necessary, to achieve the quality of outcomes required; evaluate your overall strategy and present the outcomes from your work, including ways of further improving your skills. IOLP.L3 – agree targets and plan how these will be met over an extended period of time; use your plan, seeking feedback and support from relevant sources, to help meet targets; seek information from relevant sources to establish evidence of your achievements, when reviewing your progress.
Evidence: project brochures; CRM project feasibility study; comparative analysis between systems; reports on partners’ feedback; product presentations; documentation on the application system.
Typology: technical documents; technical reports; endorsement documentation.
Role in the drafting of evidence: creator.
Complexity: structured–unpredictable; taking responsibility.
Behaviour/action verbs: original thinking, problem solving, taking care and responsibility, developing, innovating, managing, transforming; planning, making decisions, supervising, building teams, reviewing performances.

B.3. Testing (EQF level 5; e-Competence level 2)
Constructs and executes systematic test procedures for IT systems or customer usability requirements to establish compliance with design specifications. Ensures that new or revised components or systems perform to expectation. Ensures that internal, external, national and international standards are met, including health and safety, for usability, performance, reliability or compatibility. Produces documents and reports to evidence certification requirements.
e-Competence level 2: Organises test programmes and builds scripts to stress-test likely vulnerabilities. Records and reports outcomes, providing analysis of results.
DOSY level: C.L1 – read and identify the main points and ideas from documents about straightforward subjects; write about straightforward subjects. WWO.L1 – understand what needs to be done to achieve the objectives; carry out tasks to meet your own responsibilities. IOLP.L1 – understand the short-term targets you have been given, and plan how these will be met; follow your plan to meet targets and improve your performance; review your progress and achievements. IOLP.L2 – use your plan to meet targets, identifying when you need support from others.
Evidence: performance analysis / ethical hacking procedure.
Typology: technical documents; technical reports.
Role in the drafting of evidence: creator.
Complexity: structured–unpredictable; under general supervision.
Behaviour/action verbs: “making”, “organising”, “using” (tools for testing), “building” (reports), “testing”, “writing” (technical documentation), “defining” (code review elements), “analysing”, “developing”, “programming”, “discovering”; designing, managing, surveying, monitoring, evaluating, improving, finding non-standard solutions.

D.4. Purchasing (EQF level 4; e-Competence level 2)
Applies a consistent procurement procedure, including deployment of the following sub-processes: specification of requirements, supplier identification, proposal analysis, contract negotiation, supplier selection, contract placement. Ensures that the entire purchasing process is fit for purpose and adds business value to the organisation.
e-Competence level 2: Understands and applies the principles of the procurement process; places orders based on existing supplier contracts. Ensures the correct execution of orders, including validation of deliverables and correlation with subsequent payments.
DOSY level: C.L1 – take part in discussions about straightforward subjects; read and identify the main points and ideas from documents about straightforward subjects; write about straightforward subjects; make a presentation about knowledge of a subject. WWO.L1 – understand what needs to be done to achieve the objectives; carry out tasks to meet your own responsibilities; organise tasks to meet your responsibilities and work co-operatively with others. IOLP.L1 – understand the short-term targets you have been given, and plan how these will be met; follow your plan to meet targets and improve your performance; review your progress and achievements. IOLP.L2 – identify evidence of your achievements when reviewing your progress.
Evidence: performance data related to the use of order-process applications; reports on the matching of the parameters required by the order process against those achieved.
Typology: formal documentation; demo; endorsement.
Role in the drafting of evidence: under general supervision.
Complexity: structured–predictable.
Behaviour/action verbs: scheduling, organising, integrating, finding standard solutions, interacting, communicating, working in a team.

C.1. User Support (EQF level 6; e-Competence level 3)
Responds to user requests and issues; records relevant information. Resolves or escalates incidents and optimises system performance. Monitors solution outcome and resultant customer satisfaction.
e-Competence level 3: Manages others’ activities and is accountable for ensuring that agreed service levels are met. Plans resource allocation to ensure support is available within user business hours. Acts creatively, and seeks opportunities for continuous service improvement by analysing root causes. Manages costs to budget.
DOSY level: C.L2 – help move discussions forward; give a short talk using an image to illustrate your main points; read and summarise information from extended documents; use a suitable structure and style when writing extended documents. C.L3 – create opportunities for others to contribute to group discussions about complex subjects; make a presentation using a range of techniques to engage the audience. WWO.L2 – plan what needs to be done to achieve the objectives and clarify responsibilities; organise tasks to meet your responsibilities and work co-operatively with others. WWO.L3 – seek to establish and maintain co-operative working relationships, agreeing ways to overcome any difficulties; review work with others, including factors that influenced the outcome. IOLP.L2 – help set short-term targets and plan how these will be met; use your plan to meet targets, identifying when you need support from others. IOLP.L3 – seek information from relevant sources to establish evidence of your achievements, when reviewing your progress.
Evidence: help-desk management; call-centre management; trouble ticketing; SLA execution; audits of achieved performance compared against service-level agreements; customer satisfaction surveys.
Typology: formal and technical documents; technical reports.
Role in the drafting of evidence: taking responsibility.
Complexity: structured–unpredictable.
Behaviour/action verbs: supporting, controlling, monitoring, investigating, taking care and responsibility, measuring, improving, managing; planning, making decisions, supervising, building teams, training people, reviewing performances, finding creative solutions by application of specific technical or business knowledge/skills.

E.3. Risk Management (EQF level 6; e-Competence level 3)
Implements the management of risk across information systems through the application of the enterprise-defined risk management policy and procedure. Assesses risk to the organisation’s business, and documents potential risk and containment plans.
e-Competence level 3: Decides on appropriate actions required to adapt security and address risk exposure. Evaluates, manages and ensures validation of exceptions; audits ICT processes and environment.
DOSY level: C.L2 – help move discussions forward; give a short talk using an image to illustrate your main points; read and summarise information from extended documents; use a suitable structure and style when writing extended documents. C.L3 – create opportunities for others to contribute to group discussions about complex subjects; make a presentation using a range of techniques to engage the audience. WWO.L2 – plan what needs to be done to achieve the objectives and clarify responsibilities; organise tasks to meet your responsibilities and work co-operatively with others. WWO.L3 – seek to establish and maintain co-operative working relationships, agreeing ways to overcome any difficulties; review work with others, including factors that influenced the outcome. IOLP.L2 – help set short-term targets and plan how these will be met; use your plan to meet targets, identifying when you need support from others. IOLP.L3 – seek information from relevant sources to establish evidence of your achievements, when reviewing your progress.
Evidence: organisation and implementation of the company’s business data back-up; data authentication procedures; audits.
Typology: technical documents; technical reports; email; demo; endorsement.
Role in the drafting of evidence: concept creator (of the whole document or some parts of it); supervisor; reviser.
Complexity: structured–unpredictable.
Behaviour/action verbs: understanding, applying, adapting, investigating, deploying, managing, taking responsibility, consulting; planning, making decisions, supervising, building teams, training people, reviewing performances, finding creative solutions by application of specific technical or business knowledge/skills.

Annex 2: Table 2 – Excerpt from the document “Making competence explicit” (being adopted by the Lombardy Region)

Making competence explicit
Recognition of competence (fill in the name of the competence): .............................................

Tell me what you can do and in what way. Imagine having to convey this competence to someone who wants to acquire it. Explain what this competence is. How do you practise this competence?
.................................................................................................................................................
How did you get this competence? (Training courses, self-training, experience in the field)
.................................................................................................................................................
Describe some contexts (private or public companies; large, medium or small enterprises; non-profit associations; leisure activities; processes; projects) where you practised this competence, specifying times and terms
.................................................................................................................................................
Describe briefly the most significant experience in which you practised this competence
.................................................................................................................................................
Concerning the experience described above, did you practise this competence individually or working in a team?
.................................................................................................................................................
What was your role?
.................................................................................................................................................
Concerning the experience described above, how did you manage the activities related to this competence? (E.g. did you plan activities? Did you identify some priorities?)
.................................................................................................................................................
Concerning the experience described above, which tools did you use?
.................................................................................................................................................
Thinking about the experience described above, which risks might you contemplate? (Give some examples)
.................................................................................................................................................
Thinking about the experience described above, which problems might occur and how might you manage them? (Give some examples)
.................................................................................................................................................
Thinking about the experience described above, which mistakes might you make and how might you prevent them? (Give some examples)
.................................................................................................................................................
How might you correct them? (Give some examples)
.................................................................................................................................................

Annex 3: Table 3 “Correspondence model EQF, e-CF levels” (from The User guidelines for the application of the European e-Competence Framework, CEN 2008 and 2010)

The table relates each e-Competence level (e-1 to e-5) to the corresponding EQF level(s) and characterises it in terms of typical tasks, complexity, behaviour and autonomy.

e-Competence level 1 – Associate (corresponding to EQF level 3)
EQF level description: knowledge of facts, principles, processes and general concepts in a field of work or study; a range of cognitive and practical skills in accomplishing tasks and problem-solving with basic methods, tools, materials and information; responsibility for completion of tasks in work or study, adapting own behaviour to circumstances in solving problems.
e-Competence level description: able to apply knowledge and skills to solve straightforward problems; responsible for own actions; operating in a stable environment.
Typical tasks: support/service.
Behaviour: applying, adapting, developing, deploying, maintaining, repairing, finding basic/simple solutions.
Autonomy: responds to instructions; works under routine supervision; uses minor discretion in resolving problems or enquiries; works without frequent reference to others.

e-Competence level 2 – Professional (corresponding to EQF levels 4 and 5)
EQF level 4 description: factual and theoretical knowledge in broad contexts within a field of work or study; expertise in a range of cognitive and practical skills in generating solutions to specific problems in a field of work or study; self-management within the guidelines of work or study contexts that are usually predictable, but are subject to change; supervising the routine work of others, taking some responsibility for the evaluation and improvement of work or study activities.
EQF level 5 description: comprehensive, specialised, factual and theoretical knowledge within a field of work or study and an awareness of the boundaries of that knowledge; expertise in a comprehensive range of cognitive and practical skills in developing creative solutions to abstract problems; management and supervision in contexts where there is unpredictable change; reviewing and developing performance of self and others.
e-Competence level description: operates with capability and independence in specified boundaries and may supervise others in this environment; conceptual and abstract model-building using creative thinking; uses theoretical knowledge and practical skills to solve complex problems within a predictable and sometimes unpredictable context.
Typical tasks: concepts/basic principles.
Behaviour: scheduling, organising, integrating, finding standard solutions, interacting, communicating, working in a team.
Autonomy: works under general supervision; uses discretion in identifying and resolving complex problems and assignments; specific instruction is usually given and work is reviewed at frequent milestones; determines when problems should be escalated to a higher level.

e-Competence level 3 – Senior Professional/Manager (corresponding to EQF level 6)
EQF level description: advanced knowledge of a field of work or study, involving a critical understanding of theories and principles; advanced skills, demonstrating mastery and innovation in solving complex and unpredictable problems in a specialised field of work or study; management of complex technical or professional activities or projects, taking responsibility for decision-making in unpredictable work or study contexts and for continuing personal and group professional development.
e-Competence level description: respected for innovative methods and use of initiative in specific technical or business areas; providing leadership and taking responsibility for team performances and development in unpredictable environments.
Typical tasks: consulting.
Behaviour: designing, managing, surveying, monitoring, evaluating, improving, finding non-standard solutions.
Autonomy: interprets instructions and makes choices; works under broad direction; full accountability for own technical work or project/supervisory responsibilities; receives assignments in the form of objectives; establishes own milestones and team objectives and delegates assignments; work is often self-initiated.

e-Competence level 4 – Lead Professional/Senior Manager (corresponding to EQF level 7)
EQF level description: highly specialised knowledge, some of which is at the forefront of knowledge in a field of work or study, as the basis for original thinking; critical awareness of knowledge issues in a field and at the interface between different fields; specialised problem-solving skills in research and/or innovation to develop new knowledge and procedures and to integrate knowledge from different fields; managing and transforming work or study contexts that are complex, unpredictable and require new strategic approaches; taking responsibility for contributing to professional knowledge and practice and/or for reviewing the strategic performance of teams.
e-Competence level description: extensive scope of responsibilities deploying specialised integration capability in complex environments; full responsibility for strategic development of staff working in unfamiliar and unpredictable situations.
Typical tasks: IS strategy/holistic solutions.
Behaviour: planning, making decisions, supervising, building teams, training people, reviewing performances, finding creative solutions by application of specific technical or business knowledge/skills.
Autonomy: has defined authority and responsibility for a significant area of work, including technical, financial and quality aspects; establishes organisational objectives and delegates assignments; accountability for actions and decisions taken by self and subordinates.

e-Competence level 5 – Principal (corresponding to EQF level 8)
EQF level description: knowledge at the most advanced frontier; the most advanced and specialised skills and techniques to solve critical problems in research and/or innovation; demonstrating substantial authority, innovation, autonomy, and scholarly or professional integrity.
e-Competence level description: overall accountability and responsibility; recognised inside and outside the organisation for innovative solutions and for shaping the future using outstanding leading-edge thinking and knowledge.
Typical tasks: IS strategy or programme management.
Behaviour: conceiving, transforming, innovating, finding creative solutions by application of a wide range of technical and/or management principles.
Autonomy: makes personal choices; has authority and responsibility for all aspects of a significant area of work, including policy formation and application; is held fully accountable for actions taken and decisions made, both by self and subordinates.

The complexity of the context increases across the levels, from structured–predictable at the lower levels, through structured–unpredictable, to unpredictable–unstructured at the higher levels.

Annex 4: Table 4 “Evidence Assessment Grid” (being adopted by the Lombardy Region)

For each competence, the grid lists the pieces of evidence provided (Evidence 1, 2, 3, … n) against the relevant indicators (Indicator a, b, c, … n). Each indicator is scored for each piece of evidence as completely satisfied (3), on average satisfied (2) or not satisfied (1), with space for notes. Scores of 1 and 2 (highlighted in red) are not fully satisfactory and therefore need to be investigated in the final assessment stage. During the evidence assessment the EQF indicators also need to be satisfied (please see Annex 5: Table 5 “EQF indicators related to the Evidence Assessment Grid”); if they are not satisfied, a more thorough analysis is needed during the final assessment phase. The grid concludes with a brief report on the evidence assessment.

Annex 5: Table 5 “EQF indicators related to the Evidence Assessment Grid” (being adopted by the Lombardy Region)

This is a guideline for those responsible for the certification process, to evaluate the coherence of the application with the competences to be certified, e.g. at EQF level 5. Each EQF dimension is scored YES or NO:
- Context complexity: the context is unstable and unpredictable; the project or the reference activities have different objectives at different levels.
- Activity typologies: the activities require understanding, discretion and problem-solving abilities; the activities require theoretical thought in addition to practical skills.
- Autonomy: the activities require the ability to carry out activities alone, without external help, even though under supervision, and an understanding of which tools and resources are to be used.
Such indicators are used during both the evaluation of evidence and the final assessment. The aim is to confirm competence at level 5, which was evaluated at the outset of the process based only on the candidate’s CV and project/activity overview.
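To illustrate how the rules in Annexes 4 and 5 fit together, the short Python sketch below encodes them in executable form: any indicator scored 1 or 2 is flagged for investigation in the final assessment, and any EQF dimension not confirmed calls for a more thorough analysis. This is only an illustration under those assumptions; the function and field names (review_evidence, grid, eqf_indicators) are invented for the example and are not part of the Lombardy Region tools.

# Illustrative sketch of the checks described in Annexes 4 and 5 (names invented for the example).
def review_evidence(grid, eqf_indicators):
    """grid: {evidence: {indicator: score}} with scores 3/2/1;
    eqf_indicators: {EQF dimension: True for YES, False for NO}."""
    # Scores of 1 or 2 are not fully satisfactory and must be revisited in the final assessment.
    to_investigate = [(e, i, s) for e, scores in grid.items()
                      for i, s in scores.items() if s < 3]
    # Any EQF dimension answered NO requires a more thorough analysis.
    unmet_dimensions = [d for d, confirmed in eqf_indicators.items() if not confirmed]
    return to_investigate, unmet_dimensions

grid = {"Evidence 1": {"Indicator a": 3, "Indicator b": 2},
        "Evidence 2": {"Indicator a": 1, "Indicator b": 3}}
eqf = {"Context complexity": True, "Activity typologies": True, "Autonomy": False}
flagged, unmet = review_evidence(grid, eqf)
print(flagged)  # [('Evidence 1', 'Indicator b', 2), ('Evidence 2', 'Indicator a', 1)]
print(unmet)    # ['Autonomy'] -> more thorough analysis needed in the final assessment

In practice the same checks are carried out by the assessor on the paper grids; the sketch simply makes explicit the decision rule that a certification can only proceed smoothly when all indicator scores are 3 and all EQF dimensions are confirmed.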
Annex 6: Table 6 “Oral Examination Grids” (being adopted by the Lombardy Region)

For each competence, the grid lists the relevant indicators (Indicator 1, 2, 3, 4, … n). During the oral examination each indicator is rated on a four-point scale – 1 = not evident, not shown; 2 = scarcely evident, scarcely shown; 3 = evident, shown; 4 = absolutely evident, absolutely shown – with space for notes. To recognise and certify a competence, the score for each indicator must be “evident, shown” or “absolutely evident, absolutely shown”. Furthermore, the EQF indicators must be confirmed (please see Annex 7: Table 7 “EQF Indicators related to the Oral Examination Grids”). Where even one indicator receives a lower score, the candidate must repeat the certification process, in agreement with the certification body and the assessor, following a guidance interview. The grid concludes with a brief report on the final assessment.

Annex 7: Table 7 “EQF Indicators related to the Oral Examination Grids” (being adopted by the Lombardy Region)

This is a guideline for those responsible for the certification process, to evaluate the coherence of the application with the competences to be certified, e.g. at EQF level 5. Each EQF dimension is scored YES or NO:
- Context complexity: the context is unstable and unpredictable; the project or the reference activities have different objectives at different levels.
- Activity typologies: the activities require understanding, discretion and problem-solving abilities; the activities require theoretical thought in addition to practical skills.
- Autonomy: the activities require the ability to carry out activities alone, without external help, even though under supervision, and an understanding of which tools and resources are to be used.
Such indicators are used during both the evaluation of evidence and the final assessment. The aim is to confirm competence at level 5, which was evaluated at the outset of the process based only on the candidate’s CV and project/activity overview.

Annex 8: FORM A “Master template for allocating EQF level and learning category weighting to OVERALL QUALIFICATION”

Panel Reference:
ALLOCATION OF EQF LEVEL AND LEARNING CATEGORY VALUES
The comments are important for the audit trail. The panel assigning values needs to have a common understanding of what these terms mean to them.
For each learning category – Knowledge (knowledge and understanding), Skills (application and action) and Competence (autonomy and accountability) – the panel records the EQF level (best fit), the learning category weighting (percentage), and comments about why the level (and learning category weighting value) has been allocated in this way. The form closes with the overall EQF level and any other comments.

Annex 9: FORM B “Working Template for allocating EQF level and learning category weighting to INDIVIDUAL UNITS”

Panel Reference:
Name:
Qualification/Unit under review:
The panel assigning values needs to have a common understanding of what these terms mean to them.
As in Form A, for each learning category – Knowledge (knowledge and understanding), Skills (application and action) and Competence (autonomy and accountability) – the panel records the level (best fit), the learning category weighting (percentage), and comments about why the level (and learning category weighting values) have been allocated in this way, followed by the overall level and any other comments.
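As a worked illustration of how the values captured by Forms A and B might be brought together, the sketch below records a panel’s per-category allocations and computes a weighted average of the levels as one possible starting point for discussion. The forms themselves leave the overall “best fit” level to the panel’s judgement; the weighted-average heuristic, the example figures and all names in the code are assumptions made for this example only.

# Illustrative sketch only: the weighted-average suggestion is an assumption,
# not a rule from the guidelines, and all names and figures are invented for the example.
def suggest_overall_level(allocations):
    """allocations: {learning category: (best-fit EQF level, weighting %, comment)}"""
    total = sum(weight for _, weight, _ in allocations.values())
    weighted = sum(level * weight for level, weight, _ in allocations.values())
    return round(weighted / total)  # the panel still decides the final "best fit"

form_a = {
    "Knowledge (knowledge and understanding)":  (5, 40, "specialised factual and theoretical knowledge"),
    "Skills (application and action)":          (5, 40, "creative solutions to abstract problems"),
    "Competence (autonomy and accountability)": (4, 20, "supervised, mostly predictable contexts"),
}
print(suggest_overall_level(form_a))  # 5 - a starting point for the panel's discussion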