MicrowebOrg + openscience   48

The Seven Deadly Sins of Psychology: A Manifesto for Reforming the Culture of Scientific Practice eBook: Chris Chambers: Amazon.de: Kindle-Shop
In this unflinchingly candid manifesto, Chris Chambers draws on his own experiences as a working scientist to reveal a dark side to psychology that few of us ever see. Using the seven deadly sins as a metaphor, he shows how practitioners are vulnerable to powerful biases that undercut the scientific method, how they routinely torture data until it produces outcomes that can be published in prestigious journals, and how studies are much less reliable than advertised. He reveals how a culture of secrecy denies the public and other researchers access to the results of psychology experiments, how fraudulent academics can operate with impunity, and how an obsession with bean counting creates perverse incentives for academics. Left unchecked, these problems threaten the very future of psychology as a science—but help is here.

Outlining a core set of best practices that can be applied across the sciences, Chambers demonstrates how all these sins can be corrected by embracing open science, an emerging philosophy that seeks to make research and its outcomes as transparent as possible.
openscience 
13 days ago by MicrowebOrg
How to start an Open Science revolution! An interview with patient advocate, Graham Steel. – ScienceOpen Blog
When did you first hear about open access/data/science? What were your initial thoughts?

In order, I first heard about open access in late 2006, open science the following year, and then open data. My initial thoughts were that all of these were much-needed and refreshing alternatives to everything I had seen or read on such topics up until then, i.e., closed access, prohibitive paywalls, "data not shown", etc.

You’re what some people call a ‘Patient Advocate’ – what is that, and what’s the story there?

Broadly speaking, the terms Patient Advocate and Patient Advocacy can mean a number of things. By definition, "Patient advocacy is an area of lay specialization in health care concerned with advocacy for patients, survivors, and carers". For me personally, this began in 2001 and mainly concerned bereaved relatives, and then patients and their family members. See here for further details.

You relentlessly campaign for various aspects of open science – what drives you in this?

By way of background, I would say with certainty that during the period of around 2008 to 2011, the (sadly now defunct) social media aggregator site FriendFeed was the space in which the foundations of a lot of my current thinking were laid. Having already been primed with open access and open data, that is where open science really took off for me in earnest. Science, and indeed research, in the open is without question the way forward for all.

What do you think the biggest impediments to open research are? How can we collectively combat or overcome them?

First and foremost has to be the Journal Impact Factor (JIF), despite an abundance of evidence over the years showing that it is a highly flawed metric. I would encourage academics to make enquiries within their institutions about taking a pledge and signing the San Francisco Declaration on Research Assessment (DORA). Secondly, as mentioned earlier, embrace the fact that it takes very little effort these days to get a preprint of your work archived on the web.

What tools or platforms would you recommend to researchers looking to get into open science?

There are so many these days, where does one start? The best resource out there at present (I am not alone in this view) is Innovations in Scholarly Communication (now available in seven languages), created by Bianca Kramer and Jeroen Bosman. Also see https://innoscholcomm.silk.co/, which is super awesome.

Where do you see the future of scholarly communication? What steps are needed to get there? Whose responsibility do you think it is to lead this change?

I don't have the answers to those myself. As of the time of writing, I would highly recommend the Open Science Framework. I am moving more and more in the direction of advocating preprints for any paper, with optional publication in a journal later.
openaccess  openscience 
4 weeks ago by MicrowebOrg
[no title]
Those in the humanities often champion collaboration and the open exchange of ideas, but you wouldn't necessarily know that when you look at the venues they use to share their work. Hybrid Pedagogy seeks to challenge not only how humanists teach, but also how they publish. The six-year-old online journal pursues humanistic values by embracing an editorial process well-established in the sciences: open peer review.

Hybrid Pedagogy’s editorial process is uncommonly inclusive. Whereas many academic journals prize selectivity, Hybrid Pedagogy accepts the vast majority of submissions—about 70 percent, according to the current editor—with the expectation that authors and reviewers work hand-in-glove to revise essays. The resulting articles are short (by academic standards), visually engaging, widely circulated, and more personal and political than those in traditional academic publications.

Open peer review itself is not new; STEM journals such as Atmospheric Chemistry & Physics and PeerJ already practice it. But pairing open peer review with web technology enables new editorial approaches. In a Views article for Inside Higher Ed, Alex Mueller, associate professor of English at the University of Massachusetts Boston, wrote that, combined with open access, open peer review can support new forms of scholarly inquiry.

Such methods have long proliferated in the sciences. For years, physicists have used arXiv.org, the physics preprint repository, for pre-publication review, said Cheryl Ball, editor of the web-text journal Kairos: A Journal of Rhetoric, Technology, and Pedagogy and an associate professor at West Virginia University.
openscience  open:peerreview  bnbuch:openscience 
4 weeks ago by MicrowebOrg
Experiment - Open doctoral project - submitted version - Submitted
The thesis was guided by the Open Science demand that comprehensive access to the entire scientific knowledge-production process, including all data and information generated during the creation, evaluation, and communication of the scientific findings, be available at all times. On the website http://offene-doktorarbeit.de the complete text and the literature used, as well as the results of the empirical work, were published promptly at every stage.
openscience  christianheise  offenedoktorarbeit 
7 weeks ago by MicrowebOrg
About – Open Science Commons
The Open Science Commons (OSC) is a new approach to sharing and governing advanced digital services, scientific instruments, data, knowledge and expertise that enables researchers to collaborate more easily and be more productive.

Within the OSC, researchers from all disciplines will have easy, integrated and open access to the advanced digital services, scientific instruments, data, knowledge and expertise they need to collaborate and achieve excellence in science, research and innovation.

Using Open Science as a guideline and applying the Commons as a management principle will bring numerous benefits for the research community, and society at large.

More about the benefits of the Open Science Commons to research

The Open Science Commons builds on the idea of an e-Infrastructure Commons, first proposed in a White Paper published in 2013 by the European e-Infrastructure Reflection Group (e-IRG).

The Open Science Commons relies on four pillars, representing a wide range of groups, providers and community types:

Data. The data that is the subject matter for research. It should be dealt with according to the principles of open access and open science, while maintaining trust and privacy for researchers.
e-Infrastructures. The technology and technical services supporting researchers, building towards integrated services and interoperable infrastructures across Europe and the world.
Scientific instruments. The equipment and collaborations which generate scientific data, from small-scale lab machines to global collaborations around massive facilities.
Knowledge. The human networks, understanding and material capturing skills and experience required to carry out open science using the three other pillars.
openscience 
10 weeks ago by MicrowebOrg
Our Mission | ORCID
A unique identifier for researchers

ORCID’s vision is a world where all who participate in research, scholarship, and innovation are uniquely identified and connected to their contributions across disciplines, borders, and time.
Our mission

ORCID provides an identifier for individuals to use with their name as they engage in research, scholarship, and innovation activities. We provide open tools that enable transparent and trustworthy connections between researchers, their contributions, and affiliations. We provide this service to help people find information and to simplify reporting and analysis.
openscience 
10 weeks ago by MicrowebOrg
Project description • OJS-de.net
The project's goal is to facilitate, expand, and secure for the long term the electronic publication of scholarly journals at German universities on the basis of OJS. The project covers software adaptation, needs analysis, building an OJS network, and increasing the visibility of OJS journals.
openscience  openjournal 
10 weeks ago by MicrowebOrg
Humanities Commons – Open access, open source, open to all
Yes, members can create multiple WP sites (for conferences, journals, courses, etc.) on HC. We have plugins for SlideShare, Soundcloud, etc.

Welcome to Humanities Commons, the sharing and collaboration network for people working in and around the humanities. Discover the latest open-access scholarship and teaching materials, make interdisciplinary connections, build a WordPress Web site, and increase the impact of your work by sharing it in the repository.

Not just articles and monographs: Upload your course materials, white papers, conference papers, code, digital projects—these can have an impact too!
openscience  digitalhumanities  opensyllabus 
10 weeks ago by MicrowebOrg
Self Journals, Open Peer-review!
Michaël Bon, Michael Taylor, Gary S. McDowell
Novel processes and metrics for a scientific evaluation rooted in the principles of science
Version 1 Released on 26 January 2017 under Creative Commons Attribution 4.0 International License

We propose an implementation of our evaluation system with the platform "the Self-Journals of Science" (www.sjscience.org).

In this system of value creation, scientific recognition is artificially turned into a resource of predetermined scarcity for which scholars have to compete. In one camp, members of the scientific community must compete for limited space in a few "top" journals, which can impede the natural unrestricted progress of science by disincentivizing open research and collaboration. In the other camp, a small number of editors must also contend with each other for exclusive content to increase the reputation of their journal, a process that can have strong negative effects on scientific output and on the research enterprise as a whole. Although many scholars wear both hats (being authors and journal editors at the same time), here we do not identify the problem in individual agents but rather in the roles themselves and the power relationship between them. Thus, we argue that it is not only the kind of value that is promoted by the current system that is questionable (journal prestige and 'impact', as in impact factor): more importantly, it is the way the system produces value and how its implicit asymmetric power structure is detrimental to scientific progress.

In the current publishing environment, since scientists are competing for the same limited resources, relations between peers can become inherently conflictual. For instance, scientists working on the same topic may tend to avoid each other for as long as possible so as not to be scooped by a competitor, whereas collectively they would likely have benefited most from mutual interaction during the early research stages. The most worrying consequence of peers' diverging interests is that debate becomes socially difficult, if not impossible, in the context of a journal. The rejection and downgrade of an article to a lower-ranked journal can be a direct consequence of a scientific disagreement that few people would openly take responsibility for, to avoid reprisals. While the reliability of science comes from its verifiability, today it is being validated by a process that lacks this very property. Journal peer review is not a community-wide debate but a gatekeeping process tied to the local policy of an editorial board.

While peer-trial still dominates the mainstream, there are strong signs that the scientific community is actively engaged in a more continuous process of validation. Browsing websites such as PubPeer or Publons (where “post-print peer-review” is possible) makes it clear that, although articles are improved with respect to initial submission, the discussion process continues long after publication and that the evolution of articles is a more dynamic construct [14]. This is at odds with the world of undisclosed email dialogues between authors and editors, and reviewers and editors during the peer-trial process.

In this section, we present a definition of scientific value and describe the open and community-wide processes required to capture it. These processes maintain symmetry in the creation of scientific value and fulfil what we consider the minimal expectations from any desirable alternative evaluation system, which are:

to promote scientific quality.
to provide incentives to authors, reviewers and evaluators.
to promote academic collaboration instead of competition.
to be able to develop in parallel with current journal publication practices (as long as these remain essential for funding and career advancement).
to propose article-level metrics that are easy to calculate and interpret.
to be verifiable and hard to game.

A prototype of an evaluation system driven by these processes is implemented in "the Self-Journals of Science" (SJS, www.sjscience.org): an open, free and multidisciplinary platform that empowers scientists to create scientific value. SJS is a horizontal environment for scientific assessment and communication, and is technically governed by an international organisation of volunteer research scholars whose membership is free and open to the entire scientific community.

We have defined scientific peer-review as the community-wide debate through which scientists aim to agree on the validity of a scientific item.

In our system, peer-review is an open and horizontal (i.e. a non-authoritative and unmediated) debate between peers where “open” means transparent (i.e., signed), open access (i.e., reviewer assessments are made public), non-exclusive (i.e., open to all scholars), and open in time (i.e. immediate but also continuous). This brings a new ethic to publishing [31]: the goal of peer-review is not to provide a one-time certification expressed in the form of a binary decision of accept or reject as per the traditional mode of publishing, rather it is to scientifically debate the validity of an article with the aim of reaching an observable and stable degree of consensus. Here, reviews are no longer authoritative mandates to revise an article, but elements of a debate where peers are also equals. The influence of a review over an article is based on its relevance or its ability to rally collective opinion, and on an open context where authors cannot afford to let relevant criticism go unanswered.

The validity of an article is captured by a transparent and community-wide vote between two options: "this article has reached scientific standards" or "this article still needs revisions".

Self-Journals. In our alternative evaluation system we introduce the concept of self-journals as a way for scientists to properly express their judgement of an article's importance for a specific field. A self-journal is a novel means of personal scientific communication; it can be thought of as a scholarly journal attached to each individual scientist, built on the curation of any scientific item available on the Web via hyperlinks (not on the appropriation of articles through a submission process).

A self-journal is released in structured issues, which are collections of articles around a certain topic. Every issue has its own title and an editorial providing an introduction for the community, and must contain a minimum number of articles (in our implementation, we set this minimum to 4). The curator can provide personal comments on each article curated in the issue (for concrete examples, see the first issue of the self-journal of Sanli Faez, Konrad Hinsen or Michaël Bon). The consistency of the selection of articles and the relevance of the personal comments determine the scientific added value of each self-journal issue. Every scientist can curate their own self-journal, through which they can release as many issues on as many topics as they please. Curators can use self-journals to review a field, present a promising way to develop it, offer a comprehensive collection of their own research, host the proceedings of a workshop or a journal club, or popularize scientific ideas and discoveries. A self-journal reflects the scientific vision of its curator and bears his or her signature. Interested readers can freely subscribe to a self-journal and be notified whenever a new issue is released.

An ecosystem of self-journals offers a way to quantify the importance of an article, primarily by the number of its curators.
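
The curator-count metric sketched above is simple enough to state in code. A minimal Python sketch, assuming a hypothetical data model (the `curator` and `articles` field names are illustrative, not SJS's actual schema):

```python
def article_importance(issues):
    """For each article, count the distinct curators whose
    self-journal issues include it (the metric proposed above)."""
    curators_by_article = {}
    for issue in issues:
        for article in issue["articles"]:
            curators_by_article.setdefault(article, set()).add(issue["curator"])
    return {article: len(curators)
            for article, curators in curators_by_article.items()}

issues = [
    {"curator": "alice", "articles": ["doi:10.1/a", "doi:10.1/b"]},
    {"curator": "bob",   "articles": ["doi:10.1/a"]},
    {"curator": "alice", "articles": ["doi:10.1/a"]},  # repeat curation, counted once
]
print(article_importance(issues))  # {'doi:10.1/a': 2, 'doi:10.1/b': 1}
```

Because curators are collected into a set per article, a curator who includes the same article in several issues is counted only once, matching the idea that importance reflects how many distinct scientists vouch for an article.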

Incentives in the absence of official recognition by institutions and funders. Self-journals have their own rationale. Firstly, they are a means of personal scientific communication that allow their curators to elaborate on an individual vision of science with the necessary depth they see fit. A self-journal therefore provides a great scientific service to its readers by providing a level of consistency in the interpretation and analysis of scientific output. In return, this service benefits curators who increase their visibility and influence over what is disseminated to the community. Self-journals give new freedom and scope to the editing process since curation, as proposed here, applies to any research work with an Internet reference. In other words, a mechanism is provided that allows scientists to fully express an aspect of their individual worth that is absent in the current system, and build a reputation accordingly.

A response is also provided to the problem of the decreasing visibility of authors, articles and reviewers as the volume of scientists and scientific works grows on the Web. Each issue of a self-journal acts as a pole of attraction that is likely to have a minimum audience: the authors whose articles have been curated can be notified about what is being said about their work, and may want to follow the curator. Moreover, on a platform like SJS where the ecosystem of self-journals is well integrated, interest for a particular article can guide readers to self-journal issues where it has been uniquely commented on and contextualized in relation to other articles.

We wish to emphasize that the interest value of a particular self-journal issue does not lie so much in the intrinsic value of the articles selected, but rather in the specific comments and collective perspective that is being given to them.
openscience  selfjournal 
11 weeks ago by MicrowebOrg
bjoern.brembs.blog » Open Science: Too much talk, too little action
I got involved in Open Science more than 10 years ago. Trying to document the point when it all started for me, I found posts about funding all over my blog, but the first blog posts on publishing were from 2005/2006, followed by the announcement of my joining the editorial board of the newly founded PLoS ONE in late 2006 and my first post on the impact factor in 2007. That year also saw my first post on how our funding and publishing system may contribute to scientific misconduct.

In an interview on the occasion of PLoS ONE’s ten-year anniversary, PLoS mentioned that they thought the publishing landscape had changed a lot in these ten years. I replied that, looking back ten years, not a whole lot had actually changed:

Publishing is still dominated by the main publishers which keep increasing their profit margins, sucking the public teat dry
Most of our work is still behind paywalls
You won’t get a job unless you publish in high-ranking journals.
Higher ranking journals still publish less reliable science, contributing to potential replication issues
The increase in number of journals is still exponential
Libraries are still told by their faculty that subscriptions are important
The digital functionality of our literature is still laughable
There are no institutional solutions to sustainably archive and make accessible our narratives other than text, or our code or our data

The only real difference in the last few years lies in the fraction of openly available articles, but that remains a small minority, less than 30% in total.

So the work that still needs to be done is exactly the same as it was when Stevan Harnad published his "Subversive Proposal" 23 years ago: getting rid of paywalls. This goal won't be reached until all institutions have stopped renewing their subscriptions. As I don't know of a single institution without any subscriptions, that task remains just as big now as it was 23 years ago. Noticeable progress has been made only at the margins and, potentially, in people's heads. Indeed, few scholars now haven't heard of "Open Access", yet apparently many haven't grasped the issues: my librarian colleagues keep reminding me that their faculty believe open access has already been achieved because they can access everything from the computer in their institute.

There can be no dispute that a lot more people are now talking about these issues. Given perhaps another 23 years, or 50, there may even be some tangible effects down the road, as long as one assumes some sort of exponential curve kicking in at some point fairly soon. It sure feels as if such an exponential curve may be about to bend upwards: with the number of Open Science events, the invitations to talk have multiplied recently.

open text, data, code:

We've already started by making some of our experiments publish their raw data automatically by default. This will be expanded to cover as many of our experiments as technically feasible. To this end, we have started working with our library to mirror the scientific data folders on our hard drives to the library's servers and to provide each project with a persistent identifier whenever we evaluate and visualize the data. We will also implement a copy of our GitHub repository as well as our SourceForge code in our library, such that all of our code will be archived and accessible right here, but can be pushed to whatever new technology arises for code sharing and development. Ideally, we'll find a way to automatically upload all our manuscripts to our publication server with whatever authoring system we choose (we are testing several of them right now). Once all three projects are concluded, all our text, data and code will not only be open by default; it will also be archived, backed up and citable at the point of origin with a public institution that I hope is likely to survive any corporation.
openscience 
11 weeks ago by MicrowebOrg
Zenodo - Research. Shared.
Zenodo in a nutshell

Research. Shared. — all research outputs from across all fields of research are welcome! Sciences and Humanities, really!
Citeable. Discoverable. — uploads get a Digital Object Identifier (DOI) to make them easily and uniquely citeable.
Communities — create and curate your own community for a workshop, project, department, journal, into which you can accept or reject uploads. Your own complete digital repository!
Funding — identify grants; integrated into reporting lines for research funded by the European Commission via OpenAIRE.
Flexible licensing — because not everything is under Creative Commons.
Safe — your research output is stored safely for the future in the same cloud infrastructure as CERN's own LHC research data.

all research outputs from across all fields of research are welcome! Zenodo accepts any file format as well as both positive and negative results. We choose to promote peer-reviewed openly accessible research, and we curate the uploads posted on the front-page.
Citeable. Discoverable. — be found!

Zenodo assigns all publicly available uploads a Digital Object Identifier (DOI) to make the upload easily and uniquely citeable. Zenodo further supports harvesting of all content via the OAI-PMH protocol.
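
Since OAI-PMH is plain HTTP plus XML, a harvester needs little more than URL construction and parsing. A minimal Python sketch (the `https://zenodo.org/oai2d` base URL is Zenodo's documented OAI-PMH endpoint; the helper names are mine):

```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

def oai_list_records_url(base_url, metadata_prefix="oai_dc", resumption_token=None):
    """Build a ListRecords request URL for any OAI-PMH endpoint."""
    params = {"verb": "ListRecords"}
    if resumption_token:
        params["resumptionToken"] = resumption_token  # continue a previous page
    else:
        params["metadataPrefix"] = metadata_prefix
    return base_url + "?" + urllib.parse.urlencode(params)

def record_identifiers(xml_bytes):
    """Extract the OAI identifiers from one ListRecords response page."""
    root = ET.fromstring(xml_bytes)
    ns = "{http://www.openarchives.org/OAI/2.0/}"
    return [el.text for el in root.iter(ns + "identifier")]

url = oai_list_records_url("https://zenodo.org/oai2d")
print(url)  # https://zenodo.org/oai2d?verb=ListRecords&metadataPrefix=oai_dc
# with urllib.request.urlopen(url) as resp:   # uncomment to harvest for real
#     print(record_identifiers(resp.read()))
```

A real harvester would loop, re-requesting with the `resumptionToken` found in each response page until none is returned.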
Community Collections — create your own repository

Zenodo allows you to create your own collection and accept or reject uploads submitted to it. Creating a space for your next workshop or project has never been easier. Plus, everything is citeable and discoverable!

Twitter, January 2017:
Thomas Robitaille @astrofrog
Just uploaded 33Gb of data to @ZENODO_ORG in 20 minutes. Mind blown!
openscience 
january 2017 by MicrowebOrg
The 'golden road' to Open Science – scilog
What is unique about OLH is its funding model. In many conversations with libraries in 2013 and 2014 we realized that many were willing to help us set up a different mode of publishing, one that was not profit-oriented and that was more sustainable for humanities disciplines than the APC model. So we introduced the "Library Partnership Subsidy" (LPS). Instead of charging libraries through a subscription model, the institutions supporting us pay into a cost pool from which we finance the infrastructure for our publishing platform and cover production costs such as copy-editing, typesetting, digital archiving, and so on. By the time we launched the system in September 2015, nearly 100 libraries from the USA, the UK, and continental Europe had already pledged their support.

Originally we wanted to set up a so-called megajournal, publishing a large number of articles from all humanities disciplines. At the same time, a series of different overlay journals was to be built on top of it, allowing readers to organize the published material by individual research field. Although the megajournal still exists, it is now just one of a whole range of journals on our platform.

In our ongoing conversations with a number of academic editors it became clear to us that humanities scholars do not want to give up their attachment to a particular journal or brand, or to the research communities those journals have built up over the years. So if we could not convince the majority of researchers to abandon that attachment and strike out in an entirely new direction with the OLH megajournal, perhaps we could persuade them to join the OLH platform indirectly through their journals, moving whole communities toward open access rather than individual researchers.

We therefore made it possible for journals to join OLH and enjoy the benefits of our technological innovations and our APC-free model without having to give up their name or their editorial independence.

At the moment we are working on establishing a twice-yearly submission process for journals. Since the launch of our platform in September 2015, many of our colleagues seem to want to push this project forward much faster than we originally expected. There is a very dynamic movement of support for the OLH model. And we are currently working with partners in Europe, especially in the Netherlands, who are very interested in setting up open libraries for other fields such as mathematics or engineering.

Several studies show that the basic principles of open-access publishing enjoy overwhelming support, especially in the humanities. When it comes to practice, however, humanities researchers are more hesitant than those in other disciplines.

One of the main motivations for founding OLH was the realization that academic hierarchies are becoming ever more rigid. Many people actively do research but have no permanent position or work on a contract basis. Without the benefits of a permanent university post, they often have no access to paywalled publications of research results. Graduates, too, frequently lose access to scholarly material after finishing their degrees, as soon as they no longer have a university account. That prevents them from continuing their studies outside the university.

We are also aware that there are other parts of society, such as NGOs, professional associations, or even politicians, who need access to academic research for professional reasons. When all these groups lack access to scholarly publications, it makes society as a whole poorer.
openaccess  geiwi  openscience 
january 2017 by MicrowebOrg
Knowledge communism and knowledge capitalism
Communism, universalism, disinterestedness, and organized skepticism
merton  openscience  grassmuck 
january 2017 by MicrowebOrg
bjoern.brembs.blog » So your institute went cold turkey on publisher X. What now?
With the start of the new year 2017, about 60 universities and other research institutions in Germany are set to lose subscription access to one of the main STEM publishers, Elsevier, the reason being the negotiations of the DEAL consortium (600 institutions in total) with the publisher. In the run-up to these negotiations, all members of the consortium were urged not to renew their individual subscriptions.
bjoernbrembs  openaccess  openscience 
december 2016 by MicrowebOrg
What is the future of Open Education?
From Open Education to Open Science

Fifteen years ago MIT took a big leap by introducing OpenCourseWare. In the intervening years, many universities have followed in its footsteps in the world of Open Education.

In 2007 the Delft University of Technology launched their OpenCourseWare website. In 2010 we shared our course materials through iTunesU, and in 2013 we joined edX to publish openMOOCs.

The first ten years were mostly focused on the creation of more open resources. Over the last five years, the focus has shifted towards adoption. We are concentrating on the move from Open Educational Resources (OER) to Open Educational Practice (OEP).

The US converged towards a specific part of OER, Open Textbooks, and has had a lot of success with this strong focus on cost savings for students. In Europe the focus is diverging towards open science, which is a much broader process of opening up universities.
OpenScience

Often OpenScience is defined as the combination of Open Source, Open Data, Open Access, Open Education and more. More importantly, it is the movement to make scientific research, data, and dissemination (including education) accessible to all levels of an inquiring society, amateur or professional (Wikipedia, 2016). OpenScience is much more of a change in behavior than the adoption of a tool. For the European Commission, OpenScience, along with Open Innovation and Open to the World, are priorities for the next couple of years (European Commission, 2016).

European Commission (2016). Open innovation, Open Science, open to the world. A vision for Europe. Brussels: European Commission, Directorate-General for Research and Innovation. ISBN: 978-92-79-57346-0 DOI: 10.2777/061652. Available at: http://bookshop.europa.eu/en/open-innovation-open-science-open-to-the-world-pbKI0416263/
openscience  mikrobuch:uni20  oer:star5 
november 2016 by MicrowebOrg
Hypothesis | The Internet, peer reviewed. | Hypothesis
Our mission is to bring a new layer to the web. Use Hypothesis to discuss, collaborate, organize your research, or take personal notes.

Open Scholarly Annotation
jonudell  hypothes_is  mikrobuch:open  openscience 
november 2016 by MicrowebOrg
Blogsterben |
Maybe this is just my personal environment, but my assumption, based on my (unsystematic) observation, is this: blogging as a scientist, i.e. reporting on things one considers worth sharing, posting first ideas or interesting finds, and publicly reflecting on what moves you, no longer seems to be in fashion (apart from exceptions, namely the "launch" of new blogs such as Tobias's ;-)). Blogs of individual scientists tend to be shut down, lie abandoned, or are reduced to links and reposted content without any (noteworthy) commentary of their own. Where are the opinions, the positions, the criticism? And what are the reasons for the blog die-off? No time (anymore), because it is needed for grant applications and administration? No immediate benefit for one's own work, without which nothing gets done anymore? Fear of communications departments that do not like it at all when not every bit of communicative energy flows into the organization's PR? Or even the worry that university leaderships might take offense at the publicly voiced opinions of their scientists?
openscience 
october 2016 by MicrowebOrg
» Speculation: Sociality and "soundness" ((vs. excellence: "soundness" ["Triftigkeit"] as a social effect)). Is this the connection across disciplines?
Björn Brembs • 21 days ago
Isn't it ironic that, more than a decade after social media came to us from outside scholarship, the scholars who were among the earliest adopters of social media for scholarship are re-discovering just how social scholarship is? :-)

At least for me, these insights are somewhere between "well, duh" and "this should have been on everybody's minds at least 15 years ago!"

NEYLON:

what it was that distinguishes the qualities of the concept of “soundness” from “excellence”. Are they both merely empty and local terms or is there something different about “proper scholarly practice” that we can use to help us.

At the same time I’ve been on a bit of a run reading some very different perspectives on the philosophy of knowledge (with an emphasis on science). I started with Fleck’s Genesis and Development of a Scientific Fact, followed up with Latour’s Politics of Nature and Shapin and Schaeffer’s Leviathan and the Air Pump, and currently am combining E O Wilson’s Consilience with Ravetz’s Scientific Knowledge and its Social Problems. Barbara Herrnstein Smith’s Contingencies of Value and Belief and Resistance are also in the mix. Books I haven’t read – at least not beyond skimming through – include key works by Merton, Kuhn, Foucault, Collins and others, but I feel like I’m getting a taste of the great divide of the 20th century.

I actually see more in common across these books than divides them. What every serious study of how science works agrees on is the importance of social and community processes in validating claims.

In the Excellence pre-print we argued that “excellence” was an empty term, at best determined by a local opinion about what matters. But the obvious criticism of our suggesting “soundness” as an alternate is that soundness is equally locally determined and socially constructed: soundness in computational science is different to soundness in literature studies, or experimental science or theoretical physics. This is true, but misses the point. There is an argument to be made that soundness is a quality of the process by which an output is created, whereas “excellence” is a quality of the output itself. If that argument is accepted alongside the idea that the important part of the scholarly process is social then we have a potential way to audit the idea of soundness proposed by any given community.

If the key to scholarly work is the social process of community validation then it follows that “sound research” follows processes that make the outputs social. Or to be more precise, sound research processes create outputs that have social affordances that support the processes of the relevant communities. Sharing data, rather than keeping it hidden, means an existing object has new social affordances. Subjecting work to peer review is to engage in a process that creates social affordances of particular types.

"More social" on its own is clearly not enough. There is a question here of more social for whom? And the answer to that is going to be some variant of "the relevant scholarly community". We can't avoid the centrality of social construction, because scholarship is a social activity, undertaken by people, within networks of power and resource relationships.
openscience  soundness  bjoernbrembs 
october 2016 by MicrowebOrg
Science in the Open (blog) » About
I currently have a position as Professor of Research Communications at the Centre for Culture and Technology at Curtin University (AUS)
openscience  openscholarship  openaccess 
october 2016 by MicrowebOrg
#Siggenthesen – Merkur
Siggen Theses on scholarly publishing in the digital age

Digital publishing enables better working and discovery processes in scholarship. At present, these potentials are still far too often blocked for structural reasons. We want that to change, and therefore put the following theses up for discussion:

1

Digital publishing needs reliable structures instead of fixed-term projects. #Siggenthesen #1

Innovations in digital publication formats in scholarship that are developed in pilot and island projects need to be securely transferred into permanent infrastructures spanning institutions and disciplines, so that they can deliver sustainable, competitive services in the interest of the scholarly community. We call on funding bodies and political institutions as well as publishers and libraries to face up to this responsibility and to implement corresponding funding and integration concepts in the existing research system concretely and without delay. A systemic shift towards digital publishing can only be achieved through a reliable offering of excellent services.
mikrobuch:uni20  openaccess  openscience 
october 2016 by MicrowebOrg
A Simple Explanation for the Replication Crisis in Science · Simply Statistics
My primary contention here is:

The replication crisis in science is concentrated in areas where (1) there is a tradition of controlled experimentation and (2) there is relatively little basic theory underpinning the field.

-- because good theory + varied uncertain observation (astronomy) = confirmation through repetition
-- bad theory + varied uncertain observation (epidemiology)
-- good theory + controlled experiment: particle physics (the MODEL case)
-- bad theory + controlled experiment



Astronomy and Epidemiology

What do the fields of astronomy and epidemiology have in common? You might think nothing. Those two departments are often not even on the same campus at most universities! However, they have at least one common element: the things that they study are generally reluctant to be controlled by human beings. As a result, both astronomers and epidemiologists rely heavily on one tool: the observational study. Much has been written about observational studies of late, and I'll spare you the literature search by saying that the bottom line is they can't be trusted (particularly observational studies that have not been pre-registered!).

But that’s fine—we have a method for dealing with things we don’t trust: It’s called replication.

My understanding is that astronomers have a similar mentality as well: no single study will result in anyone believing something new about the universe. Rather, findings need to be replicated using different approaches, instruments, etc.

The key point here is that in both astronomy and epidemiology expectations are low with respect to individual studies. It’s difficult to have a replication crisis when nobody believes the findings in the first place. Investigators have a culture of distrusting individual one-off findings until they have been replicated numerous times. In my own area of research, the idea that ambient air pollution causes health problems was difficult to believe for decades, until we started seeing the same associations appear in numerous studies conducted all around the world. It’s hard to imagine any single study “proving” that connection, no matter how well it was conducted.

One large category of methods includes the controlled experiment. Controlled experiments come in a variety of forms, whether they are laboratory experiments on cells or randomized clinical trials with humans, all of them involve intentional manipulation of some factor by the investigator in order to observe how such manipulation affects an outcome. In clinical medicine and the social sciences, controlled experiments are considered the “gold standard” of evidence. Meta-analyses and literature summaries generally weight publications with controlled experiments more highly than other approaches like observational studies.

PREDICTION: physics vs. medicine
whether a field has a strong basic theoretical foundation. The idea here is that some fields, like say physics, have a strong set of basic theories whose predictions have been consistently validated over time. Other fields, like medicine, lack even the most rudimentary theories that can be used to make basic predictions.

We need to stop thinking that any single study is definitive or confirmatory, no matter if it was a controlled experiment or not. Science is always a cumulative business, and the value of a given study should be understood in the context of what came before it.

>> Psychology experiments:
Can be taken as an "admitted hypothesis" (a stand-in for theory) to work with, but NOT as proof.
openscience  replicationcrisis 
october 2016 by MicrowebOrg
Felix Schönbrodt's blog
The Reproducibility Project: Psychology was published last week, and it was another blow to the overall credibility of the current research system’s output.

Some interpretations of the results were in a "Hey, it's all fine; nothing to see here; let's just do business as usual" style. Without going into details about the "universal hidden moderator hypothesis" (see Sanjay's blog for a reply) or "The results can easily be explained by regression to the mean" (see Moritz' and Uli's reply): I do not share these optimistic views, and I do not want to do "business as usual".

What makes me much more optimistic about the state of our profession than unfalsifiable post-hoc "explanations" is that there has been considerable progress towards an open science, such as the TOP guidelines for transparency and openness in scientific journals, the introduction of registered reports, or the introduction of the open science badges (Psych Science has increased sharing of data and materials from near zero to near 38% in 1.5 years, simply by awarding the badges). And all of this happened within the last 3 years!

Beyond these already beneficial changes, we asked ourselves: What can we do at the personal and local department level to make more published research true?

A first reaction was the foundation of our local Open Science Committee (more about this soon).

Own Research

Open Data: Whenever possible, we publish, for every first-authored empirical publication, all raw data which are necessary to reproduce the reported results on a reliable repository with high data persistence standards (such as the Open Science Framework).

Reproducible scripts: For every first-authored empirical publication we publish reproducible data analysis scripts and, where applicable, reproducible code for simulations or computational modeling.

We provide (and follow) the “21-word solution” in every empirical publication: “We report how we determined our sample size, all data exclusions (if any), all manipulations, and all measures in the study.”1 If necessary, this statement is adjusted to ensure that it is accurate.

As co-authors we try to convince the respective first authors to act accordingly.
openscience 
october 2016 by MicrowebOrg
The 20% Statistician: Improving Your Statistical Inferences Coursera course
Improving Your Statistical Inferences Coursera course


I’m really excited to be able to announce my “Improving Your Statistical Inferences” Coursera course. It’s a free massive open online course (MOOC) consisting of 22 videos, 10 assignments, 7 weekly exams, and a final exam. All course materials are freely available, and you can start whenever you want.

In this course, I try to teach all the stuff I wish I had learned when I was a student. It ranges from the basics (e.g., how to interpret p-values, what likelihoods and Bayesian statistics are, how to control error rates or calculate effect sizes) to what I think should also be the basics (e.g., equivalence testing, the positive predictive value, sequential analyses, p-curve analysis, open science). The hands-on assignments will make sure you don't just hear about these things, but know how to use them.

My hope is that busy scholars who want to learn about these things now have a convenient and efficient way to do so. I’ve taught many workshops, but there is only so much you can teach in one or two days.
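One topic the course lists, the positive predictive value (PPV), lends itself to a short worked example. The sketch below uses the standard Bayes-rule formula; the prior, power, and alpha values are illustrative assumptions, not taken from the course materials.

```python
# Positive predictive value: the probability that a statistically
# significant finding reflects a true effect, by Bayes' rule.
# All parameter values below are illustrative assumptions.

def ppv(prior, power, alpha):
    """P(effect is real | result is significant)."""
    true_positives = power * prior
    false_positives = alpha * (1 - prior)
    return true_positives / (true_positives + false_positives)

# A well-powered test of a plausible hypothesis:
print(round(ppv(prior=0.5, power=0.8, alpha=0.05), 3))  # 0.941
# An underpowered test of a long-shot hypothesis:
print(round(ppv(prior=0.1, power=0.2, alpha=0.05), 3))  # 0.308
```

In the second scenario a significant result is more likely false than true even at p < .05, which is why power analysis and sequential designs sit next to PPV in a curriculum like this.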
mooc  statisticalthought  coursera  openscience 
october 2016 by MicrowebOrg
OS Committee (LMU 2015) Felix Schönbrodt's blog
Crisis of psychological research
60% not replicable
Transparency is necessary for GOOD research
Open Data ((like the Rogoff data!! story))

The committee’s mission and goals include:

Monitor the international developments in the area of open science and communicate them to the department.
Organize workshops that teach skills for open science (e.g., How do I write a good pre-registration? What practical steps are necessary for Open Data? How do I apply for the Open Science badges? How do I do an advanced power analysis? What are Registered Reports?).
Develop concrete suggestions concerning tenure-track criteria, hiring criteria, PhD supervision and grading, teaching, curricula, etc.
Channel the discussion concerning standards of research quality and transparency in the department. Even if we share the same scientific values, the implementations might differ between research areas. A medium-term goal of the committee is to explore in what way a department-wide consensus can be established concerning certain points of open science.

The OSC developed some first suggestions about appropriate actions that could be taken in response to the replication crisis at the level of our department. We focused on five topics:

Supervision and grading of dissertations
Voluntary public commitments to research transparency and quality standards (this also includes supervision of PhDs and coauthorships)
Criteria for hiring decisions
Criteria for tenure track decisions
How to allocate the department’s money without setting incentives for p-hacking
openscience 
october 2016 by MicrowebOrg
Barcamp Science 2.0: Open Science in der Praxis? Einfach anfangen! | Leibniz-Forschungsverbund Science 2.0
Leibniz Research Alliance Science 2.0

Open, innovative, and curious: that is how the participants, and also the sessions, of the second Barcamp Science 2.0, held on the occasion of the third Science 2.0 Conference in Cologne, could be summed up. Under the motto "Putting Science 2.0 and Open Science into Practice", ideas were exchanged and plenty of material for further discussion was collected.

How can Open Science be practiced (sustainably), and what infrastructure has to be in place for it?

Infrastructure for Open Science [Pad] [Podcast]
Practicalities of data sharing [Pad] [Podcast]
Data formats for Open Science [Pad]
Package Management for research projects [Pad]

How can researchers be convinced of Open Science? Which advantages and disadvantages have to be considered? What matters to researchers, and which incentives are interesting to them?

Incentives for Open Science [Pad] [Podcast]
Teaching Open Science [Pad] [Podcast]
Is Open Science bad for Science? [Pad]

Which tools are used, and what are best-practice examples? How can existing tools be used to process large amounts of data?

Tools for Open Science [Pad] [Podcast]
Jupyter Notebooks [Pad]
Wikipedia & Wikidata as a workbench [Pad] [Podcast]
Analyzing scholarly tweets [Pad]
Structuring research and publications [Pad] [Podcast]

What has to be considered with Open Access publications and peer review? What effects, positive as well as negative, does SciHub have?

Peer Review [Pad]
Preregistration for publications [Pad] [Podcast]
SciHub good or bad [Pad] [Podcast]
openscience 
october 2016 by MicrowebOrg
Towards Open Science: The Case for a Decentralized Autonomous Academic Endorsement System | Zenodo
The current system of scholarly communication is based on tradition, and does not correspond to the requirements of modern research.

The dissemination of scientific results is mostly done in the form of conventional articles in scientific journals, and has not evolved with research practice.

In this paper, we propose a system of academic endorsement based on blockchain technology that is decoupled from the publication process, which will allow expeditious appraisal of all kinds of scientific output in a transparent manner without relying on any central authority.
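The abstract above can be made concrete with a toy sketch of one ingredient such a system needs: endorsement records linked by hashes so anyone can verify them without a central authority. This is not the paper's protocol; the record fields, the endorser names, the example DOI, and the use of a plain hash chain rather than a full blockchain are all assumptions for illustration.

```python
import hashlib
import json

def make_record(prev_hash, endorser, output_doi):
    """Create an endorsement record chained to the previous one."""
    record = {"prev": prev_hash, "endorser": endorser, "output": output_doi}
    payload = json.dumps(record, sort_keys=True).encode()
    record["hash"] = hashlib.sha256(payload).hexdigest()
    return record

def verify_chain(chain):
    """Recompute every hash and check each prev-link."""
    prev = "0" * 64
    for rec in chain:
        body = {k: v for k, v in rec.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if rec["prev"] != prev or rec["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        prev = rec["hash"]
    return True

# Hypothetical endorsers and DOI, for illustration only.
genesis = make_record("0" * 64, "alice", "10.5281/zenodo.0000000")
second = make_record(genesis["hash"], "bob", "10.5281/zenodo.0000000")
chain = [genesis, second]

print(verify_chain(chain))      # a valid chain verifies
second["endorser"] = "mallory"  # any tampering is detectable
print(verify_chain(chain))
```

The design point is the decoupling the paper argues for: because verification is just hash recomputation, appraisal of a record needs no journal or other central gatekeeper.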
openscience  blockchain 
september 2016 by MicrowebOrg
open science
According to studies, the quality of work in top-ranking journals is not better; if anything, the evidence suggests it is worse.
(Results are exaggerated, etc.)
bjoernbrembs  openscience 
september 2016 by MicrowebOrg
chem-bla-ics: Doing science has just gotten even harder
A second realization is that few scientists understand, or want to understand, copyright law. The result is hundreds of scholarly databases which do not define who owns the data, nor under what conditions you are allowed to reuse it, share it, reshare it, or modify it. Yet scientists do. So not only do these databases often not specify the copyright/license/waiver (CLW) information, they certainly don't really tell you how they populated their database, e.g. how much they copied from other websites under the assumption that knowledge is free. Sadly, database content is not. Often you don't even need to wonder about it, as it is evident, or even proudly stated, that they used data from another database. Did they ask permission for that? Can you easily look that up? Because of the below-quoted argument, you are now only allowed to link to that database once you have figured out the status of the data. And believe me, that is not cheap.

Combine that, and you have this recipe for disaster.
Furthermore, when hyperlinks are posted for profit, it may be expected that the person who posted such a link should carry out the checks necessary to ensure that the work concerned is not illegally published.

A community that knows these issues very well, is the open source community. Therefore, you will find a project like Debian to be really picky about licensing: if it is not specified, they won't have it. This is what is going to happen to data too.
openscience 
september 2016 by MicrowebOrg
Academic Torrents
We are a community-maintained distributed repository for datasets and scientific knowledge

Welcome to Academic Torrents!
Making 15.47TB of research data available.

We've designed a distributed system for sharing enormous datasets - for researchers, by researchers. The result is a scalable, secure, and fault-tolerant repository for data, with blazing fast download speeds. Contact us at contact@academictorrents.com.
openscience 
august 2016 by MicrowebOrg
hcommons.org (humanities plattform)
Connect with Fellow Humanists
Support Open Access to Research
Humanities Commons Is Open and Not-for-Profit
Brought to you by a consortium of trusted not-for-profit organizations

Explore new modes of scholarship as you share, find, and create your own digital projects. 
Publish your work and increase its visibility with a professional profile and Web site.
Join groups focused on a research or teaching topic, event, or advocacy project—or create your own.

Humanities Commons will help you . . .

Humanities Commons includes a library-quality open-access repository for interdisciplinary scholarship called CORE. The first of its kind, CORE allows users to preserve their research and increase its audience by sharing across disciplinary, institutional, and geographic boundaries.

Humanities Commons is designed to serve the unique needs of humanists as they engage in teaching and research that benefits the larger community. Unlike other social and academic networks,
Humanities Commons is entirely open access, open source, and not-for-profit. It is focused on providing a space to discuss, share, and store cutting-edge research and innovative pedagogy—not on generating profits from users' intellectual and personal data.

Use Humanities Commons to . . .

Host an online conference or continue the conversation after the event.

Store and share your articles, syllabi, data sets, and presentations in a library-quality digital repository.

Connect and collaborate with others who work in the humanities.
Humanities Commons, a project spearheaded by the Modern Language Association (MLA), links online community spaces for the MLA; College Art Association; Association for Jewish Studies; and the Association for Slavic, East European, and Eurasian Studies.

These partners have collaborated to create Humanities Commons—a crossdisciplinary hub for anyone interested in humanities research and scholarship. As other not-for-profit humanities organizations join the partnership, Humanities Commons will grow even larger. 
Humanities Commons is funded by a generous grant from the Andrew W. Mellon Foundation. Recognizing the need for an online professional network for—and by—humanists, the Mellon Foundation supported the development of Commons sites for partner societies and the shared identity-management system that connects these sites and their users to a larger Humanities Commons network.

Brought to you by a consortium of not-for-profit humanities organizations, Humanities Commons is an open-access and open-source digital platform.
ANYONE, anywhere will be able to create a free account and participate in this vibrant intellectual community. 
If you work in the humanities, Humanities Commons is your space for collaborating with colleagues across disciplines, sharing teaching tools, and building a professional profile. With your free account, you can create a Web site, engage in community discussions, and more. 
mikrobuch:uni20:open  openaccess  openscience  hcommons  digitalhumanities 
august 2016 by MicrowebOrg
Project Jupyter | Home
The Jupyter Notebook is a web application that allows you to create and share documents that contain live code, equations, visualizations and explanatory text. Uses include: data cleaning and transformation, numerical simulation, statistical modeling, machine learning and much more.
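As a flavor of what such a document holds, here is a minimal sketch of the kind of cell a notebook typically contains (data cleaning followed by a summary statistic); the input values are made up for illustration.

```python
import statistics

# A notebook-style cell: clean a messy column of readings, then
# summarize it. The raw values are invented for this example.
raw = ["3.1", "2.9", "", "3.4", "n/a", "3.0"]

# Keep only entries that parse as plain decimal numbers.
cleaned = [float(x) for x in raw if x.replace(".", "", 1).isdigit()]

print(cleaned)                    # [3.1, 2.9, 3.4, 3.0]
print(statistics.mean(cleaned))   # 3.1
```

In a notebook, the code, its output, and the explanatory text around it live in one shareable document, which is what makes the format useful for open, reproducible analyses.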
openscience 
june 2016 by MicrowebOrg
Open Science | Telepolis
"The accusation that my doctoral thesis is a plagiarism is absurd. I am happy to check whether, among more than 1,200 footnotes and 475 pages, individual footnotes may be missing or incorrectly set, and I would take this into account in a new edition."1

This is how former defense minister Karl-Theodor zu Guttenberg answered the question of whether parts of his dissertation might have been copied from other works without his having marked this sufficiently.

Do you see what I am getting at? Research results are served up ready for consumption, but the processes that lead to these results remain in the dark. Outsiders are delivered a product, but cannot retrace how it came about or which ideas were pursued and discarded along the way. Above all, they do not see which problems had to be solved on the way, which mistakes were made, and which lessons were drawn from them. Such things, too, are part of science. Leave them out, and a completely false picture emerges. And then we wonder why people do not understand what is so bad about what Mr. zu Guttenberg did.

From 1.0 to 2.0

But the internet in particular now hands us a way to do something about this. We gain a back channel, and that changes a great deal. In the simplest case, scientists can report on topics from their field in blogs and answer questions from interested readers. A swift, direct exchange becomes possible, but that is far from exhausting the potential. There is room for more.

And all of these examples are what I would call Open Science 2.0. The point is not the presentation of finished content, but the creation, checking, and improvement of that content by researchers, practitioners, and enthusiastic amateurs. Whoever takes part in developing knowledge understands much better what science actually is and means. Conversely, researchers may stay more grounded and regain the view of the whole that their specialization may have cost them.

Former chancellor Helmut Schmidt, at any rate, holds that science is "a search for knowledge bound to social responsibility"8 and must concern itself with the great problems of humanity, such as overpopulation, climate change, the globalization of the economy, or worldwide military build-up. This calls for the cooperation of many: independent experts as well as affected amateurs.
guttenberg  plagiat  schavan  openscience 
may 2016 by MicrowebOrg
Coko Foundation
Despite fundamental shifts in how humans use technology in research, mass communication and popular media, we are still publishing like it’s 1999. At the Collaborative Knowledge Foundation, we’ve set our sights on transforming the research communication sector by building shared infrastructure that will improve what we publish and increase the integrity and speed of the process.
openscience  uni2.0:avantgarde 
april 2016 by MicrowebOrg
Konrad Förstner
Currently, I am the head of the bioinformatics group at the Core Unit Systems Medicine, University of Würzburg, Germany. I advocate openness of source code, science, data, education, content (basically everything) and am an active member of the Open Science group of Open Knowledge.
uni2.0:avantgarde  openscience 
april 2016 by MicrowebOrg
Schwerpunktinitiative "Digitale Information": Start
The priority initiative "Digital Information" is a joint initiative of the Alliance of Science Organisations in Germany to improve the supply of information in research and teaching.

With the initiative, the science organisations pursue the goal of

making digital publications, research data, and source materials available as comprehensively and openly as possible, thereby also ensuring their reusability in other research contexts,

via Helmholtz, Kiel
openscience 
april 2016 by MicrowebOrg
Positionspapier „Research data at your fingertips“ - 2015_Positionspapier_AG_Forschungsdaten.pdf
I. Vision 2025
"Research data at your fingertips"

Researchers in all disciplines can access all research data easily, quickly, and with little effort, in order to do research at the highest level and achieve excellent results. They can work together with others and preserve their research results securely. Research data is available in a form that enables and facilitates research across disciplinary as well as national boundaries.

The publication of research data and software increases scientific reputation. Researchers are supported in collecting, gathering, recording, and managing their data. Easy-to-use digital infrastructures, together with scientific and technical information specialists, support the complete research cycle.
openscience 
april 2016 by MicrowebOrg
Kiel Thilo Paul-Stüve - Open Data // Welcome - GEOMAR
Welcome to the Data Management Portal
for Kiel Marine Sciences hosted at GEOMAR
openscience 
april 2016 by MicrowebOrg
Offene Wissenschaft > Open Knowledge Foundation Deutschland
The term Open Science bundles strategies and practices that all aim to use the opportunities of digitization consistently in order to make every component of the scientific process openly accessible and reusable via the internet. This is meant to open up new ways of handling scientific knowledge for science, society, and the economy.

The German-speaking OKF working group "Open Science"

For the area of science, a German-speaking Open Science working group constituted itself on 16 July 2014 at the OKFestival in Berlin. The group's goal is to network people active in opening up science and research (Open Science) and to develop legally sound framework conditions for publishing research results. In addition, the working group is to coordinate cooperation with other international Open Science groups and act as a contact point on Open Science for researchers, institutes, civil society, business, and politics.
openscience 
april 2016 by MicrowebOrg
Helmholtz Open Science: Newsletter 49 vom 12.06.2014
The term Open Science, however, also covers the opening of the entire scientific process in the sense of an "intelligent openness" (Boulton, G. et al. 2012: Science as an open enterprise. London: Royal Society).

The Helmholtz Association promotes Open Science, i.e. open access to scientific knowledge, its verifiability and reusability, and its transfer into society, and thereby continues a process that began in 2003 with the initial signing of the "Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities".

OPEN ACCESS
- publications
- data
- software/algorithms
openscience 
april 2016 by MicrowebOrg
Offene Wissenschaft – Wikipedia
In the 1990s, the term 'public science' ('Öffentliche Wissenschaft') was newly and decisively coined for the German-speaking world by the sociologist and cultural scientist Caroline Y. Robertson-von Trotha. In her opening speeches at the Karlsruher Gespräche of 1997 and 1998, she outlined a notion of 'public science' as a synonym for interdisciplinary, dialogue-based science communication.[3][4] She subsequently embedded the concept in its historical-sociological context[5][6] and in 2012 carried out the first of several analyses "in the mirror of Web 2.0 culture"[7].[8] At the same time, as founding director of the ZAK in Karlsruhe, she also established her conception of 'public science in theory and practice' institutionally: alongside research and teaching, it forms one of the three equal pillars on which the center rests.[9][10]
openscience 
april 2016 by MicrowebOrg
