Benchmarking National - PRO INNO Europe
BENCHMARKING NATIONAL AND REGIONAL SUPPORT SERVICES FOR SMES IN THE FIELD OF INTELLECTUAL AND INDUSTRIAL PROPERTY<br />
order”, to “personal initiative”, to “[usage of learning] experiences gained through the activities of the whole organisation”.<br />
The extent to which user needs were actually scrutinised may also be subject to discussion: in some instances, focus groups were consulted; in others, “...demand was clear from the number of questions the ‘parent service’ received about IPR” (service provider). For many older services, the way the services came into existence was not even traceable.<br />
Quality assurance<br />
Graph 10 provides an overview of the types of quality assurance mechanisms employed, differentiated between services in the benchmarking phase and services which were actually selected as case studies for phase 3 of the underlying research, the good practice analysis. As can be seen, a rather large share of services (23 %) have no quality assurance mechanisms in place. The majority of the services (59 %) conduct regular monitoring exercises, a category which covers activities such as the collection of feedback forms or reporting to the funding organisation (e.g., yearly reports). “Other” quality assurance mechanisms (such as working groups with customers) are implemented in 35 % of the services in the benchmarking phase. Overall, only half of the services undergo formal evaluations (interim evaluations, ex-post evaluations or regular audits). In addition, evaluations seem to be conducted less frequently on services from the patent offices than on those from other types of organisations. Given that the services selected for benchmarking already represent the better-performing ones, this result may indicate a lack of evaluation culture in the IPR-for-SMEs service world. Services that are evaluated tend to perform, on average, better than non-evaluated ones. The services selected as case studies for presenting good practice elements have, on average, tighter quality assurance mechanisms in place than the benchmarked ones.<br />
The absence of evaluations of the IPR services analysed has implications especially for accountability and for customer orientation – the latter in contrast to the service providers’ self-perception. Regarding accountability, it is questionable whether the funding bodies of the services actually have all the information necessary to gauge performance. Regarding customer orientation, it seems that the service providers’ knowledge of their customers may be limited. Even for some case study services, it was difficult to obtain contact databases that were large enough and contained all necessary contact information as well as information on the types of customers (SMEs, patent attorneys, large enterprises, etc.). Data protection issues play a role, but they seem to be only part of the story.<br />
Graph 10 Quality assurance mechanisms in place, percentage of services*) **)<br />
[Bar chart, values in %, comparing benchmarked services with services exhibiting “good practice” elements across six categories: regular monitoring exercises, interim evaluations, ex-post evaluations, regular audits, other quality assurance mechanisms, and no quality assurance mechanisms.]<br />
*) Multiple counts allowed<br />
**) Ex-ante evaluations would, in the strictest sense, also be part of the quality assurance mechanisms, but for better readability they are discussed as part of the preparatory activities (see Graph 9).<br />
Source: Benchmarking process, n (benchmarked services) = 66, n (case study services) = 15