Complementarities in Innovation Strategy and the Link to Science
By Philip Lane, October 21, 2009
Categories: Economic Performance. Tags: Irish science policy.

This CREI report by Bruno Cassiman is an accessible overview of the link between scientific research and innovation.

15 replies on "Complementarities in Innovation Strategy and the Link to Science"

Well written and well informed. A convincing reminder that nothing is simple about the smart economy, and that simple policies are unlikely to work.

Thanks for the link/comment Philip and Richard. Hope to look at the paper soon. It is right down my alleyway, I should think. Richard, are you going to the Spirit of Ireland lecture at the RDS at 6.00pm tomorrow? Seamus Garvey's excellent presentation about offshore wind and storage is on the Engineers Ireland TV webcast to view.

"We need a better understanding of the innovation process at the micro level in order to foster innovation at the firm level and develop the correct policy measures at a more aggregate level." The issue for Ireland is what is required to motivate multinationals to locate significant research operations in Ireland. Would most significant research likely be kept in the home country? As regards the Irish focus of university research on high tech and biotechnology, areas where Irish firms have a patchy performance, I published an article last month arguing that the main focus should be on our area of strength: food. Germany, long an agricultural importer, became a net food/beverage exporter in 2008 with a rise of 15% in exports. http://www.finfacts.ie/irishfinancenews/article_1017985.shtml

Philip, thanks for the link.
It came at an opportune time for me because I shared the outward and return flight to Brussels last Sunday/Monday with a former, very research-active scientist from UCD. He had read some of the negative comments by Irish economists about the value of scientific R&D to the economy. He found these disheartening because, as he put it, why should he devote so much time and energy to chasing big grants and recruiting new staff to UCD to participate in large-scale projects if there is little or no benefit to the economy or society? He would much rather apply for a small grant of, say, €250,000 to allow him to do the work he is really interested in and spend his Sunday evenings at home! And, he noted, it seemed to him that many of his colleagues in some other university schools follow this strategy, which allows them time to pursue what outsiders would deem to be academic hobbies. Of course, the quick answer is that his status, promotion etc. depend on getting big grants. But is this a mistaken policy at university level (both in Ireland and internationally)? I had no answers for him, but I have forwarded him the link to the Cassiman document.

@Brendan: I hope that status and promotion at UCD do not depend on getting big grants! In serious universities worldwide, status and promotion depend on teaching and research performance, where the latter is measured by the quality of your publications (outputs) rather than by the amount of money you raise (inputs). Sometimes the inputs matter in getting the outputs, i.e. in subjects where the costs of research are high. Sometimes you can do great work for no money at all. In many fields €250,000 is a lot of money. It seems almost tragic to me if serious scholars are not doing the work they are really interested in because of a desire to bring in big grants.
Not sure that all serious academic institutions worldwide are that high-minded: "Academic institutions have lost their way in using big federal grants as a marker of individual academic success, confusing institutional advancement with advancement of science." http://www.acphysci.com/aps/resources/PDFs/APS_0909_Skipjack.pdf

The Cassiman article is a great find. Thank you Philip. Has anyone ever worked this back to first principles? What I mean is, why did civilisation invent science to begin with? Why do we need science? Speaking from my own experience, working in the field of architecture and design, I know we are impoverished in that field. Science, apart from anything else, was a way in which many, many people could combine efforts in some way and be more productive, because all of a sudden there was a framework set up whereby knowledge could be shared, captured, refined and so on. I made some sharp remarks to Karl Whelan in an earlier thread about 'closed cultures'. What I really mean is professional traditions that suffer because they do not have what science has – this idea of the value of sharing thoughts and ideas from the best brains in society, or in the world, at any one time and over time. Alan Kay gets into this discussion in a short video link here: http://www.youtube.com/watch?v=mG1w9VkvLdU

Before we begin any discussion about science, I think we should peel it back to the core. What are the basic benefits to humankind from having science in the first place? I actually envy the engineering tradition myself, even though architectural designers are not supposed to envy engineers. But engineers do benefit from this link back to science. I don't know about economists – I believe that economists want to benefit from a 'scientific' basis and foundation to what they do. It would be great if it worked, but at the end of the day perhaps economics is best kept closer to politics and the humanities. It is worth looking for a couple of videos by Richard M.
Stallman too. Stallman grew up immersed in campus life himself, but points to the problems of science today, where mathematical equations, ancient food recipes, you name it, are starting to be fenced in and called 'intellectual property'. Yochai Benkler, Jamie Boyle and the Duke School of Law people have some interesting views, from a legal perspective, on what they call the second enclosure movement – the fencing off of the knowledge commons. Bruce Perens, part of the open source software movement, is really funny on this subject at times also. On the one hand, universities have the capacity to cause untold trouble for large corporations, because between them, staff and students at universities can introduce innovation which is released into the public domain. On the other hand, companies can throw money at universities in an attempt to buy them out and ensure that knowledge stays ring-fenced, whatever it is. It may be just a simple equation. Even the Young Scientist competition at the RDS has been hijacked in recent years by the large companies. I don't know where you all stand on that. I remember Enterprise Ireland got some bad press at the time, but now the floodgates are open, and kids from schools are being approached by the big players at the fair. Whatever way you look at it, you have to look at this from a legal and intellectual property point of view.

Attempting to justify the possible, as opposed to the probable, economic outcome of research is a somewhat thankless task. Not all research activities are 'equal'. There's long-term strategic, and short-term commercial, and all sorts of in-betweens. Academic, i.e. third-level, research is part long-term and part in-between (mostly). It is impossible to ascribe any benefit (economic, financial or social) until some time after the research has been completed and the results have percolated into the scientific and engineering communities. So any debate, discussion or whatever about the benefits is a waste of time.
You either believe yes, or no. It's a matter of faith. I have grave misgivings about the possible waste of taxpayers' euros that are ladled out to our third-level institutions. I suspect, though I have only anecdotal evidence for this, that the majority of the projects are self-serving – to amass publications to ascend the greasy pole of academia. Veblen had it nailed – conspicuous consumption for decorous emulation! If I had the authority to allocate funding to third-level institutes, I would split it 75/25 in favour of improving the quality of teaching at undergraduate level in these institutes. Some of the undergraduate teaching is of a deplorable standard. This is completely unacceptable. But I suppose it's a bit like the two tailors, Kelly and Lynch. Never Mind the Quality (of the teaching), Just Look at that Pile of Publications!! B Peter

This Cassiman report looks good – I will read it in more detail later on. In the meantime, to add to the complexity, see below.

Science and the Corporate Agenda: the detrimental effects of commercial influence on science and technology. Launched 12 October 2009 by Scientists for Global Responsibility. Abstract: It is no secret that links between the commercial sectors and science and technology are increasing. Many policy-makers, business leaders and members of the science community argue that this is positive for both science and society. But there is growing evidence that the science commercialisation agenda brings with it a wide range of detrimental effects, including bias, conflicts of interest, a narrowing of the research agenda, and misrepresentation of research results. This report takes an in-depth look at the evidence for these effects across five sectors: pharmaceuticals; tobacco; military/defence; oil and gas; and biotechnology. Its findings make disturbing reading for all concerned about the positive role of science and technology in our society. Exec summary and full report:
http://www.sgr.org.uk/SciencePolicy/CorporateInfluence.html

@B P Woods: Sure, there can be a big drop-off in the quality of academic programmes between secondary and third level. That needs to be addressed properly. A lot of the criticism in recent years has been aimed at the second-level education system: how do we want to teach kids, should the way we teach maths be changed, etc. However, that only takes on board half of the problem. The problem being, when you get kids up to some standard during the second-level programme, we should try a lot harder to continue that momentum having built it up, and improve the learning experience at third level. Obviously more cutbacks and poorer standards at third level are going to make things worse. I imagine a lot of universities complain too; they don't want to be fixing problems that should be tackled at second level – i.e. teaching basics that should have been taught at second level. I don't know. There are many sides to the debate.

One of the best contributions to this debate I ever heard was from Nicholas Negroponte of MIT in his book Being Digital, where he notes that European students are often overworked at second level. By the time they reach third level they are effectively worn out in terms of energy. Negroponte argues that the opposite approach is taken in the US, where students have loads of energy left by the time they get to third level and you can structure the third-level programme to channel that energy. Let's be honest about this. After the second-level system here in Ireland, which is very good, how much more do students at third level have left in the tank? Then you get to fourth level, and you witness the phenomenon of people who are comfortable being lifelong academics – people who are simply able to continue churning out more and more great work, well after others throw in the towel. From my own experience, I don't know where my momentum ran out.
My plan for the last couple of years had been to return to education again, in middle age some time: to focus more on working in the meantime, and bring the benefit of real-world experience back into education with me later. The unfortunate thing is, the economy has changed now in Ireland. I don't know how my skills in the construction sector are applicable anymore going forward. Does one re-skill, or up-skill, or what does one do? Does one move, as has been the more traditional solution, to a location where your skills dovetail with opportunities? A very big debate indeed.

@David O'Donnell: You have hit upon a decent point there, David, and I would like to follow up the SGR report you have linked. One point which has arisen in debates lately is the unpredictability of investment in R&D: namely, that many countries which invested in research for military purposes etc. found their investment bore fruit in the non-military economy also. The thing for sure with military spending is that a strong element of project management comes with it. Personnel get used to dealing with project timelines, which have set targets and goals to achieve – that is, in terms of product output. Getting organised properly and coordinating with a team to achieve production outputs – a lot of these are skills which help the real economy at the end of the day. I am sure a lot of these skills fed back into the post-WWII economy in the US.

@Brian O'Hanlon: The military have been in the vanguard of technological innovation, mainly for war, since man first picked up and used a weapon. In the present socio-economic era this has accelerated – where we now talk about the military-industrial complex. And yes, there are spillovers into the broader economy – as long as they don't blow themselves up in the meantime!
I haven't fully read either yet – but this is a very big question and it will take more than a few straight lines to get anywhere near it – hence my inclusion of a more critical side, which can be neglected when the only positives measured are on the other side. It is also something we need to become better at if we are to be proven smart enough to thrive and prosper in a sustainable manner, using our heads as distinct from our arses. Now if we could put that into an equation – then this would become the first blog to win the Nobel Prize in Economics (-;

@David O'Donnell: I read about a company lately, an American company – the exact one slips my mind now and it may not even be in existence still. I am pretty sure it was semiconductors. The military were one of the early adopters of semiconductor technology, because it represented something more durable for them in the field. The old vacuum tubes etc. that soldiers carried around were quite suspect. You only have to look at that great old movie about Operation Market Garden, starring Sean Connery, the operation that went horribly wrong for the British paratroopers in Holland – and the equipment they used was faulty too. The trend regarding warfare since the 1940s, it seems to me, is to try to conserve the lives of troops and provide the very best of equipment. The driver of that seems to be the sheer political unpopularity of war campaigns in today's environment, and the need to keep defence forces small and keep casualties to a minimum. Hence the pressure on technology to come up with answers. Is there any other sector which is putting that much pressure on technology to innovate these days? If you take the point that in health care, for instance, a lot of resources are aimed at developing first-world 'lifestyle' drugs, rather than solving larger global medical problems.
Of course, that all ties back into the issues Pat McArdle mentioned in his article in the business section of the IT: the need to re-balance the world, for China to increase its demand domestically and for the US to become de-leveraged. I guess we are looking at a future where the Chinese become a larger and 'different' market for health care and medicine. I can think of computers and networking – I don't know if there is that much pressure on technology there to solve any critical problems – I mean, like there is in the area of warfare to solve difficult problems. The one area I can think of which has become a critical driver of technology these days is the environment and green energy. I know a nice bit now about offshore energy projects. It is driving human beings to work in environments they aren't used to working in, to solve problems which they never tackled before.

But getting back to the American company: it got its big start serving the demand from those early military contracts. That gave the company some breathing space and it got itself firmly established. But it wasn't stupid either. It knew it had to find some new outlet; the military spending couldn't continue forever. It was successful in making the transition into other products. I suppose the arena of war pushes technology in many ways: lighter, better power consumption, yet stronger and more durable. These always seem to be the 'budgets', if not the cost per se. Even guys like Richard Feynman, a great ecologist and naturalist it could be said, started his career with the Manhattan Project. It is interesting to note that IBM's 360 operating system project in the 1960s cost the same amount of money to develop as the Manhattan Project in the 1940s, and it almost broke IBM as a company. It had to devote more and more resources to the operating system project, which threatened to take longer and longer to finish.
Many say it was a testament to IBM as a company and its management that they didn't sink as a result of their first unified OS for the company. Certainly, in business today, we see nothing as adventurous as that happening. But it is important to note that IBM in those days had 'X' amount of market share but captured up to 90% of profits, as I understand it. It dominated its field.