Department of Defense

  • NSFAdmin | Posted: Apr 20, 2012 | Improving Army Basic Research: Report of an Expert Panel on the Future of Army Laboratories
  • [scisip] Globalization of S&T: Key Challenges Facing DOD

    Submitted by Anonymous on

    The National Defense University has published a new report by Drs. Tim Coffey and Steve Ramberg of the Center for Technology and National Security Policy. Excerpts from the report include the following:

      • DOD's ability to maintain an authoritative awareness of S&T developments around the world will become increasingly problematic.
      • DOD will not have the fiscal resources to buy its way out of these problems by funding its own standalone program that is large enough to maintain insight into the global S&T program, or to play catch-up to a foreign effort that has gotten ahead.
      • For DOD to succeed, it will be necessary to find a means to tap the knowledge of the larger U.S. S&T community regarding global S&T.
      • The DOD S&T workforce must be plugged into the national S&T community broadly (and, to the extent possible, into the global S&T community).
      • To accomplish this, the DOD in-house S&T workforce must be widely recognized for its contributions to the nation's S&T program.
      • To make the most effective use of the DOD S&T workforce, it will be necessary to employ emerging tools for technology forecasting (TF) and data mining.

    For more about the report, see  The report itself can be downloaded at

    To send to the list, address your message to:

    To subscribe to the list: send the text “subscribe SCISIP” to

    To unsubscribe: send the text “unsubscribe SCISIP” to
    02.13.2012 - 02.13.2012

    [scisip] Seminar on the Science of Science and Innovation Policy

    Submitted by Anonymous on

    Seminar on the Science of Science and Innovation Policy

    What: Seminar on the Science of Science and Innovation Policy

    What’s the basis for DOD science and technology policy making? Is it “faith” based? Is it empirically based? Is it rational? What can be learned from examining the basis of non-DOD science and technology policy?

    To help address these questions, especially the last, National Science Foundation program director Dr. Julia Lane has agreed to present a seminar on the NSF’s Science of Science and Innovation Policy program and the STAR METRICS initiative.

    The NSF’s Science of Science and Innovation Policy (SciSIP) program supports research designed to advance the scientific basis of science and innovation policy. Research funded by the program thus develops, improves and expands models, analytical tools, data and metrics that can be applied in the science policy decision-making process.

    The STAR METRICS initiative--Science and Technology for America's Reinvestment: Measuring the Effect of Research on Innovation, Competitiveness and Science--is a multi-agency venture led by the National Institutes of Health, the National Science Foundation (NSF) and the White House Office of Science and Technology Policy (OSTP). The goal of the initiative is to help the federal government document the value of its investments in research and development.

    During the seminar, Dr. Lane will discuss the

      • SciSIP program and its genesis,
      • the high points of what’s been learned over 5 years of funding,
      • the interagency group and how agencies have tried to use insights provided by the research,
      • development of a common empirical infrastructure, which is STAR METRICS, and
      • the vision for the next five years.

    When:   Monday, February 13, 2012         9:30-10:30 a.m.

    Where:  National Defense University         Fort Lesley J. McNair         Lincoln Hall, Room 1119         Washington, DC 20319

    Who: The seminar is open to all who are interested or involved in science and technology policy making.

    How: No reservations are required. Please contact Jim Garcia at or 202-685-3336 if you have questions.

    Fort Lesley J. McNair, Washington D.C.

    The National Defense University is located on the grounds of historical Fort Lesley J. McNair in Southwest Washington, D.C., between the Anacostia River and the Washington Channel.

    The Main Gate is open 0600-1700, Monday through Friday, closed on Federal Holidays. The Visitors Gate (2d Street) is open 24 hours. The Pedestrian Gate near the Coast Guard Headquarters Building and at the end of 2d Street is open Monday through Friday, 0600-1800, closed on Federal Holidays. Photo identification is required at all gates. Pedestrians can access Fort McNair through the main gate with proper photo identification.



    [scisip] SOSP Weekly Update

    Submitted by Thornhill.Jennifer on

    Happy Friday.  Below are some of the most recent items posted to the SOSP Website ( 


    Title | Source | Category:

    Minerva Research Initiative | DOD | Calls for Proposal and Papers
    The Governance of Innovation and Socio-Technical Systems: Theorising and Explaining Change | Jean Monnet Centre of Excellence; Sponsor: Department of Business and Politics at the Copenhagen Business School; Sponsor: FUHU, Denmark | Calls for Proposal and Papers
    NSF SBE Minority Postdoctoral Research Fellowships (MPRF) Program | NSF | Calls for Proposal and Papers; SciSIP Students
    Innovation and Productivity | NBER | Publications and Documents
    NSF's Struggle to Articulate Relevance | Science Magazine | Publications and Documents
    Innovations in Nanotechnology at the Nanoscale Science and Engineering Centers and National Nanotechnology Infrastructure Network: Highlights of Achievements | NSF | Publications and Documents
    Does Seating Location Impact Voting Behavior on Food and Drug Administration Advisory Committees? | American Journal of Therapeutics | Publications and Documents


    Two reminders:

    1)      NSF has moved the SciSIP listserv over to a new listserv system.  Please send all future messages to

    2)      Please remember to post your items of interest to the website.  I will try to include all newly posted items in my weekly update.  If you have difficulties posting to the site, please email me. 






    Jennifer Thornhill

    Science Analyst

    Directorate for Social, Behavioral and Economic Sciences

    National Science Foundation

    4201 Wilson Boulevard

    Arlington, VA 22230

    Phone: 703-292-7273



    Historical information about listserv communications is here



    [scisip] Re: R&D Dashboard

    Submitted by Anonymous on


    SciSIP: I'd like to join the chorus stating that the R&D Dashboard's language should be revised to reflect the discussion that has been raging on this listserv, and to add to the excellent critiques that have been presented here.

    The patents are indeed troubling for a variety of reasons. First and foremost, there has been plenty of research (work by Keith Pavitt comes to mind) showing that patents cover only a very limited number of industries. Most notably absent from the patent information is anything that has to do with the intangible economy, which includes most of the service sector. Second, the relationship between NSF and patenting needs to be clarified. NSF is chartered to fund fundamental research in the non-medical science, mathematics and engineering fields. Patenting, one could argue, takes place more in the "D" of R&D and is more aligned with agencies that fund development, such as NASA, DoE and DoD. In fact, after randomly sampling some of the patents listed in the R&D Dashboard that are supposedly NSF-related, I find no mention of NSF within the patent itself; only in the dashboard does it state something vague such as "2 degrees" or "funded PI". I am not even sure what "2 degrees" means, and as to funding the PI, that may or may not have anything to do with the patent. Many researchers work on multiple projects with multiple funders. It could very well be that the patenting activity took place on a totally non-NSF-related project.

    My last critique is a note of caution along the lines of Joseph Lane's reasoning: one needs to be careful in selecting which data are best used to judge a particular agency. It has been argued in the sociology of science literature by scholars such as Michel Callon that scientists are entrepreneurs who adapt to game the system in which they reside to maximize funding for their projects. If patenting is seen as a measure of success, you risk many scientists flooding the USPTO with patents on a variety of topics which may or may not be patentable, but which at worst could stop the free flow of information that is commonly shared in the scientific community.

    That being said, the dashboard does provide an excellent tool for exploring funding and relating it to publications, which, judging by the responses so far, no one is arguing is not a valid output to measure for NIH and NSF. Excellent work getting this site up!

    Ryan Zelnio
    Scientometrician
    Contractor for NSWC-Dahlgren
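    The random spot-check described here can be sketched in a few lines. This is a minimal illustration only, not the author's actual procedure: the patent records, field names, and marker strings below are hypothetical stand-ins for data that would really come from the R&D Dashboard and the USPTO full-text files.

```python
import random

# Hypothetical dashboard records; real ones would be pulled from the
# R&D Dashboard and the USPTO full-text database.
dashboard_patents = [
    {"patent_id": "A1", "text": "Made with government support under a grant "
                                "awarded by the National Science Foundation."},
    {"patent_id": "A2", "text": "A method and apparatus for signal processing."},
    {"patent_id": "A3", "text": "This work was supported by NSF grant 0123456."},
]

def spot_check(patents, markers=("NSF", "National Science Foundation"), k=3, seed=0):
    """Randomly sample up to k patents; report whether each one's text
    mentions the funding agency."""
    rng = random.Random(seed)  # fixed seed so the check is reproducible
    sample = rng.sample(patents, min(k, len(patents)))
    return {p["patent_id"]: any(m in p["text"] for m in markers) for p in sample}

print(spot_check(dashboard_patents))
```

    On real data, a run like this would show how many dashboard-attributed patents actually acknowledge the agency in the patent text itself.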
    From: "Lane, Joseph" <>
    To: Science of Science & Innovation Policy <>
    Sent: Tue, February 15, 2011 1:01:53 PM
    Subject: [scisip] Re: R&D Dashboard

    SciSIP:

    There are many valid points raised here regarding distinctions between inputs, processes and outputs, as well as the neglected areas of outcomes and impacts. As Dan said, the “black box” of innovation persists, yet it should not if all sectors can step back and take more of a global view.


    I suggest that the narrative introducing the R&D Dashboard be revised to accurately represent what it is, and not claim to be what it is not.  The narrative should specifically eliminate the rhetoric regarding innovation and impact because, as others have noted, the contents do not represent either innovation or impact.  Five comments on that point:


    The R&D Dashboard is an artifact of its creator’s system – Federal agencies funding scientific research and engineering development.  It does not provide information about the downstream innovation system.  The input, process and output data from R&D do not track or constitute a measure of “progress in innovation.”  That phrase is inaccurate and misleading, so it should be removed.


    The R&D Dashboard is that Federal system’s perspective regarding R&D – the linear model of science push where explicit R&D investment implicitly begets market innovation.  It is not an open system model of market pull, nor does it contain evidence of need for conceptual discoveries published or tangible inventions patented.  The knowledge base generated through the model of science push is akin to an ever expanding haystack, with a particular needle of innovation increasingly difficult to find. 


    The R&D Dashboard is the data available to populate it: inputs (expenditures, locations, institutions, topic areas); processes (project and patent abstracts); and the outputs of R&D (publications from research and patents from development).  It is not a compilation of innovations and impacts.  In a logic model framework, the outputs from basic research may become inputs to the global knowledge base, while the outputs from applied research become the inputs to development.  When the opportunity for innovation arises, the innovator draws upon the knowledge base and the practice base.  Innovation may be incremental or disruptive.  Some may occur serendipitously, while most is systematic and deliberate.  None of these dynamic elements are (or could be) contained in this repository of data.


    The R&D Dashboard is a quantification of local economic gain from the infusion of public funds at the level of a university, community or state.  This infusion of funding may be considered a net gain and may have a multiplier effect locally, but the source is public coffers, not new value added.  It is not a measure of impact in any social, economic or technological sense, nationally or globally.  Such downstream impacts result from outcomes in the form of technologically or procedurally advanced goods and services in the marketplace – derived from government and university efforts but delivered by industry efforts.  These goods and services generate new value in the form of revenue drawn from customers, and expenditures allocated to supplies.


    The R&D Dashboard is a welcome and important catalogue of investment and activity growing the knowledge base across all fields of science and engineering.  It is not a listing of requirements for new scientific discoveries, or specifications for new engineering inventions.  Such a listing from the market pull side would provide targets at which the sponsoring agencies and sponsored investigators could aim their applied R&D – which would not detract from the presence of the knowledge repository generated by basic research.


    The narrative for the R&D dashboard could be improved in three ways:


    First, eliminate references to “innovation” or to “impacts.”  It is disingenuous to suggest that the data represent “progress in innovation.”  Conceptual discoveries from research and tangible inventions from development are both necessary antecedents, but together are insufficient to constitute innovation in the sense suggested – innovation in the context of domestic or global markets for goods and services.  The R&D Dashboard is – and only is – a very slick way to track the allocation of public funds to support research [and development?] activities in the U.S.  Similarly, the R&D Dashboard data permit users to map the level of Federal R&D investment, but the metrics are unrelated to any social-economic impacts.


    Second, carefully reconcile terms to increase congruence between what is described and what the R&D Dashboard contains.  For example, the phrase “research & development” and the acronym “R&D” are used, but the narrative does not articulate “development” as an activity, the relationship between research activity and development activity, or how the R&D Dashboard differentiates between research projects and development projects in terms of inputs, processes or outputs.  Further, if the data reflect activities within science and engineering, then the latter term should be added and explained.


    Third, add a qualifier presenting the limitations of what the R&D Dashboard contains, as well as the opportunities for expanding the system to link to market requirements for science and engineering.  For example, a complementary “Innovation Dashboard” would list the unresolved technological issues facing industry, to reflect the market pull forces ready and willing to take up and apply the outputs from the R&D Dashboard.  These outputs would still represent basic and applied research, as well as engineering development, but could be searched and linked in more sophisticated ways to potential users on both the supply and demand sides of the knowledge system.


    Clarifying the terminology and delineating what we have here as contributions to innovation allows us to clearly define what else we are missing and need to create in order to link knowledge supply and knowledge demand – in all three states of discovery, invention and innovation – to achieve the domestic and international policy goals expressed.


    Warm regards to all involved in this important topic.



    Joseph P. Lane, University at Buffalo


    From: Carmine Basile []
    Sent: Tuesday, February 15, 2011 12:38 PM
    To: Science of Science & Innovation Policy
    Cc: Science of Science & Innovation Policy
    Subject: [scisip] Re: R&D Dashboard



    The market potential of some inventions is often simply not exploited, and there is no univocal definition of innovation in the literature. I would be prudent and exclude patents as a proxy for innovation.





    2011/2/15 Loet Leydesdorff <>

    SciSIP: Dear colleagues, it is long accepted in the literature that patents are a proxy for inventions, not for innovation. Innovation is defined as including market introduction (or, mutatis mutandis, introduction into non-market selection environments). Inventions can be more or less science-based, and this is often researched in terms of non-patent literature references in patents. One can also look from the other side and analyze the quality of, for example, university patents. There is considerable debate about their possible quality (because they are "not invented here" from the perspective of the entrepreneur). Best wishes, Loet

    On Tue, Feb 15, 2011 at 3:28 PM, Zak Taylor <> wrote:

    SciSIP: I do research on long-run technology innovation at the national level. While I always attempt to triangulate by using as many innovation measures as possible, my single favorite remains technology patents per capita, weighted by forward citations. Certainly patents have flaws as innovation measures. But patents per capita (weighted by forward citations) correlate highly with other measures which we generally associate with aggregate innovation rates, including S&T research publications, GDP growth, manufacturing growth, exports of capital goods, R&D spending, capital formation, Nobel Prize winners, etc.  Perhaps a simple litmus test of the appropriateness of patents is that one cannot find a technologically innovative country which is not relatively well represented by its aggregate patent data; even the Soviet Union during its period of isolation from the West regularly patented at a rate roughly representative of its overall relative technological prowess. Yes, patents have problems as empirical measures, especially at the micro-level, but when aggregated at the national level and used over long time periods, I'd argue that they generally provide a statistically representative sample of a nation’s innovation rate.
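    The aggregate metric described here, patents per capita weighted by forward citations, can be sketched as follows. The specific weighting scheme (1 + forward citations, so an uncited patent still counts once) and the sample data are assumptions for illustration, not Taylor's published method.

```python
def citation_weighted_patents_per_capita(patents, population_millions):
    """Sum citation weights over a country's patents, per million people.

    Each patent contributes 1 + forward_citations, so uncited patents
    still count once; this weighting choice is an assumption made for
    illustration.
    """
    total_weight = sum(1 + p["forward_citations"] for p in patents)
    return total_weight / population_millions

# Illustrative data: three patents for a country of 10 million people.
patents = [
    {"forward_citations": 12},
    {"forward_citations": 0},
    {"forward_citations": 3},
]
print(citation_weighted_patents_per_capita(patents, 10.0))  # 1.8
```

    Comparing this quantity across countries and decades is the kind of national-level aggregation the author argues smooths out the micro-level flaws of patent data.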

    Mark Zachary Taylor
    Assistant Professor
    Sam Nunn School of International Affairs
    Georgia Institute of Technology
    781 Marietta Street NW
    Atlanta, GA


    Jeffrey Alexander wrote:

    SciSIP: This is absolutely a central question for the community on this mailing list.  I would challenge the list members to offer proposals for what alternative measures of outputs might be captured and displayed by such a tool, as Gerald has done.  Patents are, in general, AN output of research activity, but are not claimed to be a comprehensive measure of all research outputs, nor are they a reliable measure of the quality or ultimate impact of the research activity (the same can be said about publications).  However, as an intermediate measure, they provide some information about research activities on a broad basis, where none has been available in the past.

    I would also point out that data quality is at least as prominent a concern as data selection.  It was interesting to observe that a number of NIH grants in my state of Maryland have a value of $1.  Since I'm somewhat familiar with the intricacies of federal accounting systems, I understand why these grants show up this way, but to the uninformed user it may appear that a $1 research grant can produce outputs equivalent to those of a $1 million research grant.  Despite flaws such as this, I'm reluctant to argue that an interesting tool with informative implications should be jettisoned because it is not perfect.

    For disclosure purposes: I am not a SciSIP PI and have not been involved in developing this dashboard, but I do have relationships with people who were involved in the development effort.

    -jeff

    On 2/14/2011 11:48 PM, Ferreras, Ana M. wrote:


    Dear Gerald,


    I am glad you brought this up.  I have been watching very closely this accountability and transparency movement going on in DC, where individuals without experience in this area, or with just a few years of experience in evaluation and assessment, have been placed to develop monitoring tools such as dashboards and scorecards.  Get ready to see more reports coming out using performance measures, indicators, and metrics from the XVI century.  It's very sad...





    From: Gerald Barnett []
    Sent: Monday, February 14, 2011 9:52 PM

    To: Science of Science & Innovation Policy
    Subject: [scisip] Re: R&D Dashboard



    Perhaps someone could explain why patent applications and patents are counted as outputs of research.  I understand how an *invention* is a research output, as is a discovery or a data set or a software code, but the conditions on which a patent is applied for have little to nothing to do with research or with extramural research funding.  It is an institutional decision.  I don't see how a patent application means anything (it may never issue, for reasons that have nothing to do with the invention), and issued patents are a function of an institution's patenting budget and (generally) its lack of selectivity.  For all that, an unworked patent may stand as an obstacle to further research (especially by industry), university-industry collaboration, and domestic investment (anyone outside the territory of the patent can practice without taking a license or paying royalties).  Could someone point to a reasoned argument why patents, rather than, say, reports of subject inventions and their first commercial sale or use (a metric also required by BD), ought to show up as a primary metric in a federal R&D dashboard?


    Gerald Barnett 

    Director, RTEI 

    University of Washington 


    ----- Original Message -----

    Sent: Monday, February 14, 2011 3:57 PM

    Subject: [scisip] R&D Dashboard





    I'd like to draw your attention to a prototype R&D Dashboard released by OSTP for comment last week.



    Much of the work was done by SciSIP funded PIs, and SciSIP partnered with OSTP in developing the prototype.


    Comments are very welcome (the comment email info is on the About tab).




    Julia Lane
    Program Director
    Science of Science & Innovation Policy
    Current solicitation:


    Jeffrey Alexander, Ph.D.

    Senior Science & Technology Policy Analyst

    Center for Science, Technology & Economic Development


    SRI International

    1100 Wilson Boulevard, Suite 2800

    Arlington, VA  22209-2268

    --
    Prof. Loet Leydesdorff
    Amsterdam School of Communications Research (ASCoR)
    Kloveniersburgwal 48, 1012 CX Amsterdam
    Tel.: +31-20-525 6598; fax: +31-20-525 3681


    RE: FW: Re: [SIGMETRICS] The European Research Conundrum: when research organizations impede scientific and technological breakthroughs despite targets, money and policy to foster these activities

    Submitted by NSFAdmin on

    Sent by: David Wojick

    Fred raises an important issue when he refers to "a Science of Science and Innovation Policy, the SciSIP which we are all working on, and different from the Science of Science Policy." I have thought of SciSIP as spanning all of both science policy and innovation policy, but the alternative is just to focus on the intersection. In US DOE this is called Basic-Applied Coordination. In DOD it is called Transitioning. The question is how science comes to affect technology and commerce, and what policies can improve this process. There is a great deal of work to be done here. In many cases the scientific community resents being pressured for applications at the expense of understanding. We call it the basic-applied divide. See:

    David Wojick, Ph.D.
    391 Flickertail Lane, Star Tannery, VA, USA

    Oct 30, 2009 06:30:53 AM, erberndt@MIT.EDU wrote:

    SciSIP: Thanks much, Fred. This is very helpful.
    Ernie

    From: []
    Sent: Friday, October 30, 2009 4:06 AM
    To: Science of Science & Innovation Policy
    Subject: [scisip] FW: Re: [SIGMETRICS] The European Research Conundrum: when research organizations impede scientific and technological breakthroughs despite targets, money and policy to foster these activities

    I have been following this discussion on innovation definitions and indicators with some interest, as the OECD brought together the definitions of innovation for statistical purposes for the first time in 1992, after extensive testing in a number of countries including the U.S. That was the first Oslo Manual. The Oslo Manual is now in its 3rd edition and is used in 27 European countries, 30 OECD countries, 53 members of the African Union, in Latin America and the Caribbean, and in China. The indicators in Europe derive from the Community Innovation Survey (CIS) and can be found on the Eurostat website or the websites of statistical offices in most Member States. In the U.S., the NSF has launched the Business R&D and Innovation Survey (BRDIS) and should have results within the year, following the Oslo Manual. The Manual can be downloaded from the OECD website, and the citation is: OECD/Eurostat (2005), Oslo Manual: Guidelines for Collecting and Interpreting Innovation Data, Paris: OECD. The 3rd edition is closely aligned with Schumpeter, as it deals with product and process innovation, organizational change and business practices, and market development.

    In 2006, there was the OECD Blue Sky Forum II, supported by NSF, which looked at new indicators and new uses for old indicators, including those for innovation. The book that emerged from that Forum is: OECD (2007), Science, Technology and Innovation Indicators in a Changing World: Responding to Policy Needs, Paris: OECD. It is available as a PDF on the website, or it can be purchased in hard copy. The chapter to read in the Blue Sky II book is the one by Arundel on the problems of using innovation indicators for public policy. Innovation indicators have a way to go before they are as well known as GDP and employment indicators. While looking at the book, this community might find the chapter by John Marburger of interest, where he cogently puts the case for a Science of Science and Innovation Policy, the SciSIP which we are all working on, and different from the Science of Science Policy.

    There are various handbooks on the subject of innovation and its measurement. The most accessible, in my view, is: Fagerberg, Jan, David C. Mowery and Richard Nelson (eds.) (2005), The Oxford Handbook of Innovation, Oxford: Oxford University Press. Bronwyn Hall and Nathan Rosenberg plan to bring out the Handbook of Innovation in 2010 with Elsevier. Finally, if this is not a burning issue, those who have contributed to this discussion might find the book Innovation Strategies for a Global Economy: Development, Implementation, Measurement and Management of interest. It will be published by E. Elgar in July 2010, just after the OECD releases its Innovation Strategy, which has been noted in the recent exchange of emails. The details are attached.

    Fred.

    Date: Thu, 29 Oct 2009 20:27:41 +0100
    Subject: [scisip] Re: [SIGMETRICS] The European Research Conundrum: when research organizations impede scientific and technological breakthroughs despite targets, money and policy to foster these activities
    From: basile.carmine@gmail.com
    To: scisip@lists.nsf.gov
    CC: katy@indiana.edu

    SciSIP: I would turn the question regarding a definition of innovation around, asking whether the one provided by Schumpeter is not still current: "Innovation is an irreversible process of creative destruction." I still trust in that. I would also add that the recent establishment of a European Institute of Technology points in the direction of a more centralized European decision power. At any rate, in my view, the cultural differences that distinguish the European Union States do not permit a total centralization of research policy and money flows by a European organization. I strongly believe that, with an efficient intelligent system, this could become a strong point. I conclude by saying that the indicators field is developing rapidly and continuously.

    Best,
    Carmine

    2009/10/29 Dan Stokols:

    SciSIP: Katy, the work of the Center for Innovation at the Univ. of Maryland may be of interest to you.
    Best, Dan

    At 9:21 AM -0400 10/29/09, Katy Borner wrote:

    SciSIP: Very interesting discussion below. I would argue that the NSF Science of Science and Innovation Policy (SciSIP) program in the US does focus on both: science and innovation. I am still looking for a good definition of innovation and a review of major innovation indicators. Pointers are very welcome.
    k

    Loet Leydesdorff wrote:

    Chris, the situation is very interesting. National research councils traditionally organize the power of the scientific elite (Mills, Mulkay), and given the subsidiarity principle this power cannot be taken away easily by a European organization. The EU therefore in the 1980s decided to focus not on science, but on innovation (Jacques Delors). The Framework Programmes were defined in terms of the precompetitive technosciences. This terrain was as yet unoccupied by national research councils. With the shift of attention to science as central to the knowledge base of an economy (e.g., the US program SciSIP, but mainly China), this arrangement may have to be revised (for economic reasons). Thus, we are witnessing in my opinion a power struggle rather than a conundrum. At issue is who controls the allocation of research funds, and to what extent: national research councils or the EU?

    Best wishes,
    Loet

    On Wed, Oct 28, 2009 at 9:24 PM, Armbruster, Chris wrote:

    Dear colleagues, please find the abstract and the link to a new working paper on the European Research Conundrum. Comments are welcome. I would be interested to hear from colleagues interested in this issue.

    Armbruster, Chris, The European Research Conundrum: when research organizations impede scientific and technological breakthroughs despite targets, money and policy to foster these activities (October 27, 2009). Available at SSRN:

    The European Research Conundrum may be described thus: In the interest of the European Research Dream, the structure and culture of the research organization should be adapted to the mission of achieving scientific and technological breakthroughs but, alas, this mission is first overwhelmed and then deformed by the existing structure and culture of the organization. The conundrum has been highlighted publicly by the high-level review of the European Research Council (ERC), which "found fundamental problems related to rules and practices regarding the governance, administration and operations of the ERC that are not adapted to the nature of modern 'frontier' science management." The organization threatens to defeat the mission, even though the ERC is new, corresponds to targets, and is well funded.

    This paper advances three arguments. Firstly, the prevalent focus on targets, money and policy is criticized because it does little to bring about the required organizational restructuring while allowing the organization to overwhelm the mission, thus threatening a lock-in of ERA as second rate. Secondly, it is shown that it is known what kind of organizational design is conducive to scientific and technological breakthroughs, and that this knowledge could be utilized to drive forward organizational restructuring. Thirdly, some practical suggestions are made on how to gather empirical evidence about barriers and challenges in the European Research Area by tracking the experience of grantees of European flagship programmes in a multiple case-study design, which may be extended to innovation systems. To also speak to those who think that targets, money and policy should remain the focus, the research may be designed in a fashion that accommodates alternative and competing hypotheses as to what is conducive to, or impedes, scientific and technological breakthroughs and innovation systems.

    Keywords: scientific breakthroughs, technological inventions, innovation systems, European Research Area, European Research Council, scientific excellence, research university, research funding, research policy, R&D targets

    Chris Armbruster
    Executive Director, Research Network 1989
    Working papers available in Open Access

    Katy Borner
    Victor H. Yngve Professor of Information Science
    Director, CI for Network Science Center, http://cns.slis.indiana.edu
    Curator, Mapping Science exhibit, http://scimaps.org
    School of Library and Information Science, Indiana University
    Wells Library 021, 1320 E. Tenth Street, Bloomington, IN 47405, USA
    Phone: (812) 855-3256; Fax: -6166

    Energy Innovation at DOD

    Submitted by NSFAdmin on

    Sent by: Daniel Sarewitz

SciSIPpers:

Along with John Alic I am working on a project aimed at better understanding Department of Defense innovation processes, and their potential application to energy technology innovation. A brief project description is attached.

We are looking for people who have an in-depth, granular understanding of various aspects of DoD's innovation capabilities, from R&D through procurement, deployment, and operations, potentially to participate in the project in various ways: as interviewees, white-paper authors, and/or workshop participants.

I'd be grateful to hear from anyone who has worked in this area, or who has suggestions of people we could contact. To minimize traffic on the list, please respond to me directly.

Many thanks,
Dan Sarewitz

    Re: US government accounting for R&D as share of economic growth

    Submitted by NSFAdmin on

    Sent by: "Merrill, Steve"

Steve:

The Academies recently completed an extensive evaluation of the five largest SBIR programs -- DOD, DOE, NSF, NIH, and NASA -- as well as a cross-agency review with recommendations. All are accessible online; work continues on selected issues of impact and administration.

Steve Merrill
Director, STEP

-----Original Message-----
From: Fiore, Steve
Sent: Tuesday, November 02, 2010 9:55 AM
To: Science of Science & Innovation Policy
Subject: [scisip] Re: US government accounting for R&D as share of economic growth

SciSIP:

On a related note, does anyone know of any analyses of the federal government's SBIR/STTR programs? At a minimum, I'd be curious to see what percentage has successfully transitioned to actual products (and by industry) but, ideally, whether or not there have been any analyses of broader impacts from this investment in small business innovation.

Thanks,
Steve Fiore

-----Original Message-----
From: Regets, Mark
Sent: Tuesday, November 02, 2010 9:16 AM
To: Science of Science & Innovation Policy
Subject: [scisip] Re: US government accounting for R&D as share of economic growth

Thanks Carol,

I think we are saying the same thing. In no way do I want to imply that the R&E satellite accounts are not a good thing to do. The macroeconomic importance is huge.
And R&D measurement, and better understanding of what is measured, is obviously important to attempts to measure overall societal returns to R&D.

Mark

-----Original Message-----
From: Robbins, Carol
Sent: Tuesday, November 02, 2010 8:36 AM
To: Science of Science & Innovation Policy
Subject: [scisip] Re: US government accounting for R&D as share of economic growth

SciSIP:

Mark's point about the information needed about R&D for science policy makes sense. There is, however, substantial value to accurate measurement of direct intangible investment in a national accounting framework. BEA's work will lead to measures of investment and R&D stock that align with, and impact, internally consistent statistics of national, industry, and state-level Gross Domestic Product.

That said, the issue of spillovers does require a different kind of analysis, and the measures of depreciation that reflect these spillovers or externalities would indeed be different. BEA's accounts measure depreciation from the perspective of the owner of the R&D; thus it represents obsolescence and the loss of the owner's ability to capture the economic benefits.

In BEA's 2007 Survey of Current Business article on the Satellite Account, the topic was addressed this way:

"The estimates provided in this release include only the direct impact of R&D investment, that is, the direct benefit realized by the investor. These estimates do not separately identify spillovers, the benefits of R&D to firms that did not pay for the R&D. However, the Bureau of Labor Statistics (BLS) produces measures of the impact of technological change on productivity as part of its estimates of multifactor productivity for the business sector. These estimates measure spillovers directly. BLS has estimated that approximately one-fifth of the multifactor productivity residual can be attributed to R&D in recent years. These BLS estimates of the spillovers are broadly consistent with the BEA estimates of the direct impact of R&D.
For more information, see." A set of papers at BEA on R&D can be found here:

Carol Robbins
Bureau of Economic Analysis
202-606-9923

-----Original Message-----
From: Regets, Mark
Sent: Monday, November 01, 2010 2:23 PM
To: Science of Science & Innovation Policy
Cc: scisip@lists.nsf.gov
Subject: [scisip] Re: US government accounting for R&D as share of economic growth

SciSIP:

I have to agree with Gordon, but perhaps make it stronger. Whether R&D is counted like investment in the National Accounts will make a difference to the measured growth rate of GDP, and hence to macroeconomic policy. But it in no way tells us anything useful about the contribution of R&D to GDP except in the most trivial sense of the accounting relationship (GDP = C + I + G + (X - M)). This is of interest to macroeconomic policy, but the main relevance of this to science policy may be that the added attention to R&D may lead to better estimates of it (an important matter, to be sure).

Mark Regets
mregets@nas.edu
permanent email: markregets@scipolicy.com

-----Original Message-----
From: Caroline Wagner
Sent: Monday, November 01, 2010 12:01 PM
To: Science of Science & Innovation Policy
Cc: scisip@lists.nsf.gov
Subject: [scisip] Re: US government accounting for R&D as share of economic growth

Hi Gordon and colleagues,

Thanks for these excellent points. I am attaching the original BEA report (2007) on how the Dept of Commerce is calculating R&D contributions to the GDP satellite account. This may have more specific information in it that addresses the issues you note.

Caroline

On Nov 1, 2010, at 11:51 AM, Gordon Reikard wrote:

The paper in the previous link is Dennis Fixler, "Accounting for R&D in the National Accounts," presented at the ASSA meetings in January 2009. Unfortunately, it does not do what it purports to do, and leaves the question of the contribution of R&D largely unresolved.
Reading this paper, it seems to wander ambiguously through some of the production function literature, omitting many of the key references, without reaching any definite conclusions.

There are really two issues in including R&D in the national income accounts. The first is whether to include private sector R&D in GDP, under business fixed investment. At the current time, private sector R&D is treated as an intermediate input, and excluded. Paradoxically, Federal government spending is included in GDP, under government purchases. If private sector R&D were included, business investment would be higher, and as a result, GDP would also be revised upward historically.

The second is, of course, the much more difficult issue of including R&D in the production function. This paper does not estimate a production function, despite the fact that there are studies going back more than 20 years that do. Instead, it merely reports the fact that BEA and BLS have come up with widely different estimates of the R&D stock. According to this paper, BEA estimates that in 2002 the total R&D stock was only $931 billion, of which $581 billion was comprised by private industry. Presumably, this estimate is in nominal dollars. The BLS estimate for the private industry R&D stock for the same year was $1,295 billion.

Nevertheless, both estimates can be seen as implausibly low. BEA assumes very high rates of depreciation. This paper is rather vague on exactly how this is calculated, but other BEA documents indicate that 15 percent geometric depreciation was used. This assumption is at best completely arbitrary, and ignores much of the literature (see in particular Bronwyn Hall's recent paper on measuring depreciation). BLS assumes no depreciation of basic research, but 10 percent annual depreciation of development and applied research.
By comparison, if R&D is assumed not to depreciate, the corresponding values would be a stock of $2,649 billion in industry-funded R&D in 2002, and $2,095 billion funded by other sources.

The reason these estimates are important, of course, is that the size of the R&D stock influences the elasticity of R&D in the production function and, in turn, the contribution of R&D to output. Given how low the BEA estimates of the R&D stock are, it is perhaps fortunate that Fixler's paper did not estimate a production function. The resulting estimates for the impact of R&D would be far below what has been published in the academic literature.

On Oct 31, 2010, at 1:19 PM, Ervin, Dana Miller wrote:

SciSIP:

Can anyone refer me to a recent study (post Solow) that shows the contribution of growth in knowledge to US output?

Thanks,
Dana Ervin
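The BEA/BLS divergence discussed above comes down to the depreciation rate fed into a perpetual-inventory stock calculation, K_t = (1 - delta) * K_{t-1} + R_t. A minimal sketch of that sensitivity (the flow figures here are invented for illustration, not actual BEA or BLS data):

```python
def rd_stock(flows, delta, k0=0.0):
    """Perpetual-inventory R&D stock: K_t = (1 - delta) * K_{t-1} + R_t."""
    k = k0
    for r in flows:
        k = (1.0 - delta) * k + r
    return k

# Hypothetical: 20 years of constant $100B annual R&D spending.
flows = [100.0] * 20

# BEA-style assumption: 15 percent geometric depreciation.
bea_style = rd_stock(flows, 0.15)
# No-depreciation assumption: the stock is simply cumulative spending.
no_dep = rd_stock(flows, 0.0)

print(round(bea_style, 1), round(no_dep, 1))
```

Even with identical flows, the 15 percent assumption yields a stock roughly a third the size of the no-depreciation stock here, which illustrates why the choice of delta, rather than the measured flows, drives the gap between the published estimates.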