quality   13229

Stack Overflow Culture – Jon Skeet's coding blog
But ignoring emotions is a really bad idea, because we’re all human. What may well happen in that situation – even if I’ve been polite throughout – is that the asker will decide that Stack Overflow is full of “traffic cop” moderators who only care about wielding power. I could certainly argue that that’s unfair – perhaps highlighting my actual goals – but that may not change anyone’s mind.

So that’s one problem. How does the Stack Overflow community agree on what the goal of the site is, and then make that clearer to users when they ask a question? It’s worth noting that the tour page (which curiously doesn’t seem to be linked from the front page of the site any more) does include this text:

With your help, we’re working together to build a library of detailed answers to every question about programming.

I tend to put it slightly differently:

The goal of Stack Overflow is to create a repository of high-quality questions, and high-quality answers to those questions.

Is that actually a shared vision? If askers were aware of it, would that help? I’d like to hope so, although I doubt that it would completely stop all problems. (I don’t think anything would. The world isn’t a perfect place.)
quality  culture  learning 
10 hours ago by janpeuker
Licensed to Web
"Generally, I also wonder how realistic it is to expect licensing to help solve the issues we’re facing (and for the record, Mike never claims licensing will solve them). You don’t have to look far to find folks in all sorts of professions—cops, accountants, doctors—who are fully licensed and yet wreck terrible havoc. Licensing can help ensure a level of proficiency, but that’s not our problem. Our challenge is a matter of ethics, a matter of responsible consideration of the consequences of what we build. Skill does not equate to better ethics."
quality  license  ethics  Web  regulation  clevermarks 
2 days ago by nhoizey
Development at Honeycomb: Crossing the Observability Bridge to Production | Honeycomb
By being curious and empowered to leverage production data to really explore what our services are doing, we as developers can inform not only what features we build / bugs we fix, but also:

How to build those features / fix those bugs
How features and fixes are scoped
How you verify correctness or completion
How you roll out that feature or fix

When wondering… how to make a change

There’s a difference between knowing that something can be changed or built and knowing how it should be. Understanding the potential impact of a change we’re making—especially something that’ll have a direct, obvious impact on users—lets us bring data into the decision-making process.

By learning about what “normal” is (or at least—what “reality” is), we can figure out whether our fix is actually a fix or not.
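
As a rough sketch of what leveraging production data can look like at the code level, the snippet below records one wide, structured event per unit of work; that is the kind of data that later answers the “what is normal?” question. The field names and the emit helper are made up for this sketch and are not Honeycomb’s SDK.

import json
import sys
import time
import uuid

def emit(event):
    """Ship the event somewhere queryable; here it is just a JSON line on stdout."""
    print(json.dumps(event), file=sys.stdout)

def do_export(payload):
    """Stand-in for the actual work being observed."""
    time.sleep(0.01)

def handle_request(user_id, payload):
    # One wide event per request: identifiers, context, outcome, timing.
    event = {
        "request_id": str(uuid.uuid4()),
        "user_id": user_id,
        "payload_size": len(json.dumps(payload)),
        "feature_flag.new_export": True,   # record which code path ran
    }
    start = time.monotonic()
    try:
        do_export(payload)
        event["status"] = "ok"
    except Exception as exc:
        event["status"] = "error"
        event["error"] = repr(exc)
        raise
    finally:
        event["duration_ms"] = round((time.monotonic() - start) * 1000, 2)
        emit(event)

if __name__ == "__main__":
    handle_request("user-42", {"report": "monthly"})

Once events like this exist, checking whether a fix is actually a fix becomes a query over before/after durations and statuses rather than guesswork.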
devops  quality  monitoring 
3 days ago by janpeuker
Delay Within the 3-Hour Surviving Sepsis Campaign Guideline... : Critical Care Medicine
The statistically significant time in minutes after which a delay increased the risk of death for each recommendation was as follows: lactate, 20.0 minutes; blood culture, 50.0 minutes; crystalloids, 100.0 minutes; and antibiotic therapy, 125.0 minutes.
Sepsis  Quality  Guidelines 
3 days ago by nhorning615
Risky vs Safe Coding: Roles of the Programmer, The Technical Leader and the Business
This article is about safe, production-quality programming practices: Feature Analysis and Review, Design and Prototyping, Performance and Test-Driven Development, Clean Code, Technical Debt…
programming  code  quality  productivity  opinion  software  engeneering 
5 days ago by gilberto5757
Automated code reviews & code analytics | Codacy
Codacy automates code reviews and monitors code quality over time. Static analysis, code coverage and metrics for Ruby, JavaScript, PHP, Scala, Java, Python, CoffeeScript and CSS.
code  quality  tool  analysis  github 
6 days ago by lgtout
A social code analysis service to prioritize technical debt and rescue legacy code
quality  library  analytics  engineering 
7 days ago by janpeuker
Software (r)Evolution — Part 1: Predict Maintenance Problems in Large Codebases | Empear
Adam Tornhill - Software Design X-Rays
Towards an Evolutionary View of Software
As part of my day job at Empear, I’ve analyzed hundreds of different codebases. There are some patterns that I see recur over and over again, independent of programming languages and technology. Uncovering these patterns helps us understand large codebases.
A language-neutral complexity metric
There have been several attempts at measuring software complexity. The most well-known approaches are McCabe’s cyclomatic complexity and the Halstead complexity measures. The major drawback of these metrics is that they are language-specific; that is, we need one implementation for each of the programming languages used to build our system. This conflicts with our goal of providing language-neutral metrics that give a holistic overview of modern polyglot codebases.

Fortunately, there’s a much simpler metric that performs well enough: the number of lines of code. Yes, lines of code is a rough metric, yet it has predictive power just as good as that of more elaborate metrics like cyclomatic complexity. The advantage of lines of code lies in its simplicity: it is both language-neutral and intuitive to reason about. So let’s use lines of code as a proxy for complexity and combine it with a measure of change frequency to identify Hotspots in our codebase.

Identify high risk changes with Hotspots
Calculate Hotspots

A hotspot is complicated code that you have to work with often. Hotspots are calculated from two different data sources:

We use lines of code as a simple proxy for complexity.
We calculate the change frequency of each file by mining their version-control history.
CodeScene provides its Hotspot analysis as an interactive map that lets you explore the whole codebase; in the article’s visualizations, each file is represented as a circle.
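
The excerpt describes the hotspot calculation only at the level of its two inputs, so here is a minimal sketch of the idea in Python, assuming a local git repository: change frequency is mined from the log and multiplied by lines of code. This is an illustration of the concept, not CodeScene’s implementation, and multiplication is just one simple way to combine the two signals.

import subprocess
from collections import Counter
from pathlib import Path

def change_frequencies(repo="."):
    """Count how often each file appears in the repo's commit history."""
    log = subprocess.run(
        ["git", "-C", repo, "log", "--name-only", "--pretty=format:"],
        capture_output=True, text=True, check=True,
    ).stdout
    return Counter(line.strip() for line in log.splitlines() if line.strip())

def lines_of_code(path):
    """Lines of code: a rough but language-neutral complexity proxy."""
    try:
        with open(path, errors="ignore") as handle:
            return sum(1 for _ in handle)
    except OSError:
        return 0  # file was deleted or renamed since that commit

def hotspots(repo=".", top=10):
    """Rank files by change frequency multiplied by current size."""
    scored = []
    for name, revisions in change_frequencies(repo).items():
        loc = lines_of_code(Path(repo) / name)
        if loc:
            scored.append((name, revisions * loc))
    return sorted(scored, key=lambda item: item[1], reverse=True)[:top]

if __name__ == "__main__":
    for name, score in hotspots():
        print(f"{score:>10}  {name}")

Run from the root of a repository, it prints the ten files with the highest combined score, i.e. large files that change often and are therefore the first candidates to inspect.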
quality  analytics  management  engineering  book 
7 days ago by janpeuker
The Whelming › The Big Ones
Update: After fixing an import error and cross-matching the BNF-supplied VIAF data, 18% of BNF people are matched in Wikidata. This has been corrected in the text. My mix’n’match tool holds a lot of entries from third-party catalogs – 21,795,323 at the time of writing. via Pocket
metadata  quality  tools  wikipedia 
7 days ago by kintopp
1st Workshop on Quality of Open Data (QOD 2018) – BIS
The goal of this workshop is to bring together different communities working on quality of information in Wikipedia, DBpedia, Wikidata and other open knowledge bases. The workshop calls for sharing research experience and knowledge related to quality assessment in open data. via Pocket
cfp  data  diglib  open  quality  wikipedia  workshop 
7 days ago by kintopp
Canary Analysis Service - ACM Queue
Canarying is a very useful method of increasing production safety, but it is not a panacea. It should not replace unit testing, integration testing, or monitoring.

Attempting a "perfectly accurate" canary setup can lead to a rigid configuration, which blocks releases that have acceptable changes in behavior. When a system inherently does not lend itself to a sophisticated canary, it's tempting to forego canarying altogether.

Attempts at hyper-accurate canary setups often fail because the rigid configuration causes too much toil during regular releases.
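
To make the trade-off concrete, here is a minimal sketch of a tolerance-based canary check: a handful of key metrics compared against the baseline, each with an explicit allowance so that acceptable changes in behavior do not block the release. The metric names, tolerance values, and fetch_metric helper are hypothetical stand-ins, not the Canary Analysis Service’s API.

TOLERANCES = {             # relative regression allowed per metric
    "error_rate": 0.05,    # canary may be at most 5% worse than baseline
    "p99_latency_ms": 0.10,
}

def fetch_metric(deployment, name):
    """Hypothetical stand-in for a monitoring query; replace with your own."""
    samples = {                      # illustrative placeholder values only
        ("stable", "error_rate"): 0.010,
        ("canary", "error_rate"): 0.0102,
        ("stable", "p99_latency_ms"): 420.0,
        ("canary", "p99_latency_ms"): 455.0,
    }
    return samples[(deployment, name)]

def canary_passes(canary="canary", baseline="stable"):
    """Accept the canary if no key metric regresses beyond its tolerance."""
    verdicts = {}
    for metric, tolerance in TOLERANCES.items():
        base = fetch_metric(baseline, metric)
        cand = fetch_metric(canary, metric)
        verdicts[metric] = cand <= base * (1 + tolerance)
    return all(verdicts.values()), verdicts

if __name__ == "__main__":
    ok, detail = canary_passes()
    print("promote" if ok else "roll back", detail)

Keeping the metric list short and the tolerances explicit is one pragmatic middle ground consistent with the article’s warning: enough signal to catch real regressions without the toil of a hyper-accurate configuration.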
test  quality  devops 
8 days ago by janpeuker
Philosophy | suckless.org software that sucks less
…tools, with a focus on simplicity, clarity and frugality. Our philosophy is about keeping things simple, minimal and usable. We believe this should become the mainstream philosophy in the IT sector. Unfortunately, the tendency for complex, error-prone and slow software seems to be prevalent in the present-day software industry. We intend to prove the opposite with our software projects.

Our project focuses on advanced and experienced computer users. In contrast with the usual proprietary software world or many mainstream open source projects that focus more on average and non-technical end users, we think that experienced users are mostly ignored. This is particularly true for user interfaces, such as graphical environments on desktop computers, on mobile devices, and in so-called Web applications. We believe that the market of experienced users is growing continuously, with each user looking for more appropriate solutions for his/her work style.
linux  opensource  culture  quality 
8 days ago by janpeuker
