nhaliday + ieee (57 bookmarks)

Errors in Math Functions (The GNU C Library)
https://stackoverflow.com/questions/22259537/guaranteed-precision-of-sqrt-function-in-c-c
For C99, there are no specific requirements. But most implementations try to support Annex F: IEC 60559 floating-point arithmetic as well as possible. It says:

An implementation that defines __STDC_IEC_559__ shall conform to the specifications in this annex.

And:

The sqrt functions in <math.h> provide the IEC 60559 square root operation.

IEC 60559 (equivalent to IEEE 754) says about basic operations like sqrt:

Except for binary <-> decimal conversion, each of the operations shall be performed as if it first produced an intermediate result correct to infinite precision and with unbounded range, and then coerced this intermediate result to fit in the destination's format.

The final step consists of rounding according to the current rounding mode; under the default round-to-nearest mode, the result must be the closest representable value in the target precision.
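[ed.: a minimal C sketch of checking this (a heuristic of mine, not part of any standard): compare sqrt() against the same computation done in long double and coerced back to double. Where long double is genuinely wider this is a necessary condition for correct rounding, though because of double rounding a match is not a proof.]

/* Sketch: is sqrt() plausibly correctly rounded here? Assumes long
   double is wider than double (true on e.g. x86 with the 80-bit format). */
#include <math.h>
#include <stdio.h>

int main(void) {
#ifdef __STDC_IEC_559__
    puts("__STDC_IEC_559__ defined: implementation claims Annex F conformance");
#endif
    double x = 2.0;
    double r = sqrt(x);                         /* should be correctly rounded */
    double ref = (double)sqrtl((long double)x); /* wider result, coerced to double */
    printf("sqrt(%g)  = %.17g\n", x, r);
    printf("via sqrtl = %.17g\n", ref);
    printf("agree: %s\n", r == ref ? "yes" : "no (investigate)");
    return 0;
}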

[ed.: The list of other such correctly rounded functions is recommended (not required) by the IEEE-754 standard (which I've put w/ the C1x and C++2x standard drafts) under section 9.2, and it mainly consists of stuff that can be expressed in terms of exponentials (exp, log, trig functions, powers) along w/ the sqrt/hypot functions.

Fun fact: this question was asked by Yeputons who has a codeforces profile.]
https://stackoverflow.com/questions/20945815/math-precision-requirements-of-c-and-c-standard
oss  libraries  systems  c(pp)  numerics  documentation  objektbuch  list  linux  unix  multi  q-n-a  stackex  programming  nitty-gritty  sci-comp  accuracy  types  approximation  IEEE  protocol-metadata  gnu 
july 2019 by nhaliday
Computer latency: 1977-2017
If we look at overall results, the fastest machines are ancient. Newer machines are all over the place. Fancy gaming rigs with unusually high refresh-rate displays are almost competitive with machines from the late 70s and early 80s, but “normal” modern computers can’t compete with thirty to forty year old machines.

...

If we exclude the game boy color, which is a different class of device than the rest, all of the quickest devices are Apple phones or tablets. The next quickest device is the blackberry q10. Although we don’t have enough data to really tell why the blackberry q10 is unusually quick for a non-Apple device, one plausible guess is that it’s helped by having actual buttons, which are easier to implement with low latency than a touchscreen. The other two devices with actual buttons are the gameboy color and the kindle 4.

After the iphones and non-kindle button devices, we have a variety of Android devices of various ages. At the bottom, we have the ancient palm pilot 1000 followed by the kindles. The palm is hamstrung by a touchscreen and display created in an era with much slower touchscreen technology, and the kindles use e-ink displays, which are much slower than the displays used on modern phones, so it’s not surprising to see those devices at the bottom.

...

Almost every computer and mobile device that people buy today is slower than common models of computers from the 70s and 80s. Low-latency gaming desktops and the ipad pro can get into the same range as quick machines from thirty to forty years ago, but most off-the-shelf devices aren’t even close.

If we had to pick one root cause of latency bloat, we might say that it’s because of “complexity”. Of course, we all know that complexity is bad. If you’ve been to a non-academic non-enterprise tech conference in the past decade, there’s a good chance that there was at least one talk on how complexity is the root of all evil and we should aspire to reduce complexity.

Unfortunately, it's a lot harder to remove complexity than to give a talk saying that we should remove complexity. A lot of the complexity buys us something, either directly or indirectly. When we looked at input handling on a fancy modern keyboard vs. the apple 2 keyboard, we saw that using a relatively powerful and expensive general-purpose processor to handle keyboard inputs can be slower than dedicated logic for the keyboard, which would be both simpler and cheaper. However, using the processor gives people the ability to easily customize the keyboard, and also pushes the problem of “programming” the keyboard from hardware into software, which reduces the cost of making the keyboard. The more expensive chip increases the manufacturing cost, but considering how much of the cost of these small-batch artisanal keyboards is the design cost, it seems like a net win to trade manufacturing cost for ease of programming.

...

If you want a reference to compare the kindle against, a moderately quick page turn in a physical book appears to be about 200 ms.

https://twitter.com/gravislizard/status/927593460642615296
almost everything on computers is perceptually slower than it was in 1983
https://archive.is/G3D5K
https://archive.is/vhDTL
https://archive.is/a3321
https://archive.is/imG7S
techtariat  dan-luu  performance  time  hardware  consumerism  objektbuch  data  history  reflection  critique  software  roots  tainter  engineering  nitty-gritty  ui  ux  hci  ios  mobile  apple  amazon  sequential  trends  increase-decrease  measure  analysis  measurement  os  systems  IEEE  intricacy  desktop  benchmarks  rant  carmack  system-design  degrees-of-freedom  keyboard  terminal  editors  links  input-output  networking  world  s:**  multi  twitter  social  discussion  tech  programming  web  internet  speed  backup  worrydream  interface  metal-to-virtual  latency-throughput  workflow  form-design  interface-compatibility 
july 2019 by nhaliday
Hardware is unforgiving
Today, anyone with a CS 101 background can take Geoffrey Hinton's course on neural networks and deep learning, and start applying state of the art machine learning techniques in production within a couple months. In software land, you can fix minor bugs in real time. If it takes a whole day to run your regression test suite, you consider yourself lucky because it means you're in one of the few environments that takes testing seriously. If the architecture is fundamentally flawed, you pull out your copy of Feathers' “Working Effectively with Legacy Code” and you apply minor fixes until you're done.

This isn't to say that software isn't hard, it's just a different kind of hard: the sort of hard that can be attacked with genius and perseverance, even without experience. But, if you want to build a ship, and you "only" have a decade of experience with carpentry, milling, metalworking, etc., well, good luck. You're going to need it. With a large ship, “minor” fixes can take days or weeks, and a fundamental flaw means that your ship sinks and you've lost half a year of work and tens of millions of dollars. By the time you get to something with the complexity of a modern high-performance microprocessor, a minor bug discovered in production costs three months and five million dollars. A fundamental flaw in the architecture will cost you five years and hundreds of millions of dollars.

Physical mistakes are costly. There's no undo and editing isn't simply a matter of pressing some keys; changes consume real, physical resources. You need enough wisdom and experience to avoid common mistakes entirely – especially the ones that can't be fixed.
techtariat  comparison  software  hardware  programming  engineering  nitty-gritty  realness  roots  explanans  startups  tech  sv  the-world-is-just-atoms  examples  stories  economics  heavy-industry  hard-tech  cs  IEEE  oceans  trade  korea  asia  recruiting  britain  anglo  expert-experience  growth-econ  world  developing-world  books  recommendations  intricacy  dan-luu  age-generation  system-design  correctness  metal-to-virtual  psycho-atoms  move-fast-(and-break-things)  kumbaya-kult 
june 2019 by nhaliday
performance - What is the difference between latency, bandwidth and throughput? - Stack Overflow
Latency is the amount of time it takes to travel through the tube.
Bandwidth is how wide the tube is.
The amount of water flowing through the tube is your throughput.

Vehicle Analogy:

Container travel time from source to destination is latency.
Container size is bandwidth.
Container load is throughput.

--

Note, bandwidth in particular has other common meanings. I've assumed networking because this is Stack Overflow, but if it were a maths or amateur radio forum I might be talking about something else entirely.
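
[ed.: a toy C calculation (all numbers made up) of the tube analogy: total transfer time = latency + size/bandwidth, so the throughput you actually observe is always below the bandwidth.]

/* Toy numbers: 50 ms one-way latency, 1 Gbit/s link, 1 MB payload. */
#include <stdio.h>

int main(void) {
    double latency_s  = 0.050;    /* time through the tube */
    double bandwidth  = 125e6;    /* tube width: 1 Gbit/s = 125 MB/s */
    double payload    = 1e6;      /* 1 MB of water to move */

    double transfer_s = latency_s + payload / bandwidth;
    double throughput = payload / transfer_s;   /* observed flow */

    printf("transfer time: %.3f s\n", transfer_s);
    printf("throughput:    %.1f MB/s of %.1f MB/s bandwidth\n",
           throughput / 1e6, bandwidth / 1e6);
    return 0;
}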
q-n-a  stackex  programming  IEEE  nitty-gritty  definition  jargon  network-structure  metrics  speedometer  time  stock-flow  performance  latency-throughput  amortization-potential  thinking 
may 2019 by nhaliday
Teach debugging
A friend of mine and I couldn't understand why some people were having so much trouble; the material seemed like common sense. The Feynman Method was the only tool we needed.

1. Write down the problem
2. Think real hard
3. Write down the solution

The Feynman Method failed us on the last project: the design of a divider, a real-world-scale project an order of magnitude more complex than anything we'd been asked to tackle before. On the day he assigned the project, the professor exhorted us to begin early. Over the next few weeks, we heard rumors that some of our classmates worked day and night without making progress.

...

And then, just after midnight, a number of our newfound buddies from dinner reported successes. Half of those who started from scratch had working designs. Others were despondent, because their design was still broken in some subtle, non-obvious way. As I talked with one of those students, I began poring over his design. And after a few minutes, I realized that the Feynman method wasn't the only way forward: it should be possible to systematically apply a mechanical technique repeatedly to find the source of our problems. Beneath all the abstractions, our projects consisted purely of NAND gates (woe to those who dug around our toolbox enough to uncover dynamic logic), which output a 0 only when both inputs are 1. If the correct output is 0, both inputs should be 1. The input that isn't is in error, an error that is, itself, the output of a NAND gate where at least one input is 0 when it should be 1. We applied this method recursively, finding the source of all the problems in both our designs in under half an hour.
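
[ed.: a C sketch of that recursive trace over a hypothetical two-gate netlist; the Gate struct, wiring, and values are all made up for illustration. Each node carries its observed and expected output, and we walk upstream until the inputs stop being wrong.]

#include <stdio.h>

typedef struct Gate {
    const char *name;
    struct Gate *in[2];   /* NULL = primary input rather than another gate */
    int observed;         /* value actually measured on the output */
    int expected;         /* value a correct design would produce */
} Gate;

/* If some input's driving gate is also wrong, the fault lies further
   upstream; if every input is correct, this gate itself is the culprit. */
static void trace(Gate *g) {
    printf("%s: observed %d, expected %d\n", g->name, g->observed, g->expected);
    for (int i = 0; i < 2; i++) {
        Gate *src = g->in[i];
        if (src && src->observed != src->expected) {
            trace(src);
            return;
        }
    }
    printf("fault originates at %s (all inputs correct)\n", g->name);
}

int main(void) {
    /* g1 wrongly outputs 1 where a correct design would output 0;
       g2 = NAND(g1, 1) therefore shows 0 instead of the expected 1. */
    Gate g1 = {"g1", {NULL, NULL}, 1, 0};
    Gate g2 = {"g2", {&g1, NULL}, 0, 1};
    trace(&g2);
    return 0;
}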

How To Debug Any Program: https://www.blinddata.com/blog/how-to-debug-any-program-9
May 8th 2019 by Saketh Are

Start by Questioning Everything

...

When a program is behaving unexpectedly, our attention tends to be drawn first to the most complex portions of the code. However, mistakes can come in all forms. I've personally been guilty of rushing to debug sophisticated portions of my code when the real bug was that I forgot to read in the input file. In the following section, we'll discuss how to reliably focus our attention on the portions of the program that need correction.

Then Question as Little as Possible

Suppose that we have a program and some input on which its behavior doesn’t match our expectations. The goal of debugging is to narrow our focus to as small a section of the program as possible. Once our area of interest is small enough, the value of the incorrect output that is being produced will typically tell us exactly what the bug is.

In order to catch the point at which our program diverges from expected behavior, we must inspect the intermediate state of the program. Suppose that we select some point during execution of the program and print out all values in memory. We can inspect the results manually and decide whether they match our expectations. If they don't, we know for a fact that we can focus on the first half of the program. It either contains a bug, or our expectations of what it should produce were misguided. If the intermediate state does match our expectations, we can focus on the second half of the program. It either contains a bug, or our understanding of what input it expects was incorrect.
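
[ed.: a minimal C sketch of this bisection on a hypothetical four-stage pipeline; the stages and expected values are invented for illustration. One checkpoint in the middle tells you which half of the program to suspect.]

#include <stdio.h>

static int stage1(int x) { return x * 2; }   /* doubles */
static int stage2(int x) { return x + 3; }   /* adds 3 */
static int stage3(int x) { return x * x; }   /* squares */
static int stage4(int x) { return x - 1; }   /* subtracts 1 */

int main(void) {
    int x = 5;
    int mid = stage2(stage1(x));
    /* Checkpoint at the halfway point: we expect 5*2 + 3 = 13 here.
       Wrong value => bug (or wrong expectation) in stages 1-2;
       right value => focus on stages 3-4 instead. */
    fprintf(stderr, "after stage2: %d (expected 13)\n", mid);
    int out = stage4(stage3(mid));
    fprintf(stderr, "final: %d (expected 13*13 - 1 = 168)\n", out);
    return 0;
}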

Question Things Efficiently

For practical purposes, inspecting intermediate state usually doesn't involve a complete memory dump. We'll typically print a small number of variables and check whether they have the properties we expect of them. Verifying the behavior of a section of code involves:

1. Before it runs, inspecting all values in memory that may influence its behavior.
2. Reasoning about the expected behavior of the code.
3. After it runs, inspecting all values in memory that may be modified by the code.

Reasoning about expected behavior is typically the easiest step to perform even in the case of highly complex programs. Practically speaking, it's time-consuming and mentally strenuous to write debug output into your program and to read and decipher the resulting values. It is therefore advantageous to structure your code into functions and sections that pass a relatively small amount of information between themselves, minimizing the number of values you need to inspect.

...

Finding the Right Question to Ask

We’ve assumed so far that we have available a test case on which our program behaves unexpectedly. Sometimes, getting to that point can be half the battle. There are a few different approaches to finding a test case on which our program fails. It is reasonable to attempt them in the following order:

1. Verify correctness on the sample inputs.
2. Test additional small cases generated by hand.
3. Adversarially construct corner cases by hand.
4. Re-read the problem to verify understanding of input constraints.
5. Design large cases by hand and write a program to construct them.
6. Write a generator to construct large random cases and a brute force oracle to verify outputs (a sketch of this loop follows below).
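
[ed.: a C sketch of step 6; fast() and brute() are hypothetical stand-ins for the solution under test and a slow-but-obviously-correct oracle.]

#include <stdio.h>
#include <stdlib.h>

/* Solution under test: closed form for 1 + 2 + ... + n. */
static long fast(long n)  { return n * (n + 1) / 2; }

/* Brute-force oracle: too slow for real inputs, trivially correct. */
static long brute(long n) { long s = 0; for (long i = 1; i <= n; i++) s += i; return s; }

int main(void) {
    srand(12345);                      /* fixed seed => reproducible failures */
    for (int t = 0; t < 100000; t++) {
        long n = rand() % 1000;        /* small random case */
        long got = fast(n), want = brute(n);
        if (got != want) {
            printf("mismatch on n=%ld: fast=%ld brute=%ld\n", n, got, want);
            return 1;
        }
    }
    puts("all tests passed");
    return 0;
}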
techtariat  dan-luu  engineering  programming  debugging  IEEE  reflection  stories  education  higher-ed  checklists  iteration-recursion  divide-and-conquer  thinking  ground-up  nitty-gritty  giants  feynman  error  input-output  structure  composition-decomposition  abstraction  systematic-ad-hoc  reduction  teaching  state  correctness  multi  oly  oly-programming  metabuch  neurons  problem-solving  wire-guided  marginal  strategy  tactics  methodology  simplification-normalization 
may 2019 by nhaliday
Is the human brain analog or digital? - Quora
The brain is neither analog nor digital, but works using a signal processing paradigm that has some properties in common with both.
 
Unlike a digital computer, the brain does not use binary logic or binary addressable memory, and it does not perform binary arithmetic. Information in the brain is represented in terms of statistical approximations and estimations rather than exact values. The brain is also non-deterministic and cannot replay instruction sequences with error-free precision. So in all these ways, the brain is definitely not "digital".
 
At the same time, the signals sent around the brain are "either-or" states that are similar to binary. A neuron fires or it does not. These all-or-nothing pulses are the basic language of the brain. So in this sense, the brain is computing using something like binary signals. Instead of 1s and 0s, or "on" and "off", the brain uses "spike" or "no spike" (referring to the firing of a neuron).
q-n-a  qra  expert-experience  neuro  neuro-nitgrit  analogy  deep-learning  nature  discrete  smoothness  IEEE  bits  coding-theory  communication  trivia  bio  volo-avolo  causation  random  order-disorder  ems  models  methodology  abstraction  nitty-gritty  computation  physics  electromag  scale  coarse-fine 
april 2018 by nhaliday
Static electricity - Wikipedia
Electrons can be exchanged between materials on contact; materials with weakly bound electrons tend to lose them while materials with sparsely filled outer shells tend to gain them. This is known as the triboelectric effect and results in one material becoming positively charged and the other negatively charged. The polarity and strength of the charge on a material once they are separated depends on their relative positions in the triboelectric series. The triboelectric effect is the main cause of static electricity as observed in everyday life, and in common high-school science demonstrations involving rubbing different materials together (e.g., fur against an acrylic rod). Contact-induced charge separation causes your hair to stand up and causes "static cling" (for example, a balloon rubbed against the hair becomes negatively charged; when near a wall, the charged balloon is attracted to positively charged particles in the wall, and can "cling" to it, appearing to be suspended against gravity).
nibble  wiki  reference  article  physics  electromag  embodied  curiosity  IEEE  dirty-hands  phys-energy  safety  data  magnitude  scale 
november 2017 by nhaliday
What are the Laws of Biology?
The core finding of systems biology is that only a very small subset of possible network motifs is actually used and that these motifs recur in all kinds of different systems, from transcriptional to biochemical to neural networks. This is because only those arrangements of interactions effectively perform some useful operation, which underlies some necessary function at a cellular or organismal level. There are different arrangements for input summation, input comparison, integration over time, high-pass or low-pass filtering, negative auto-regulation, coincidence detection, periodic oscillation, bistability, rapid onset response, rapid offset response, turning a graded signal into a sharp pulse or boundary, and so on, and so on.

These are all familiar concepts and designs in engineering and computing, with well-known properties. In living organisms there is one other general property that the designs must satisfy: robustness. They have to work with noisy components, at a scale that’s highly susceptible to thermal noise and environmental perturbations. Of the subset of designs that perform some operation, only a much smaller subset will do it robustly enough to be useful in a living organism. That is, they can still perform their particular functions in the face of noisy or fluctuating inputs or variation in the number of components constituting the elements of the network itself.
scitariat  reflection  proposal  ideas  thinking  conceptual-vocab  lens  bio  complex-systems  selection  evolution  flux-stasis  network-structure  structure  composition-decomposition  IEEE  robust  signal-noise  perturbation  interdisciplinary  graphs  circuits  🌞  big-picture  hi-order-bits  nibble  synthesis 
november 2017 by nhaliday
What kills, current or voltage? - Quora
It's an oversimplification to say that voltage kills or current kills, and this is the cause of much misunderstanding.
 
You cannot have one without the other, so one could argue that either answer is correct.
 
However, it is the current THROUGH key parts of the body that can be lethal. BUT even a lot of current in a wire won't harm you if the current is constrained to the wire.
 
Specifically, in order for you to be electrocuted, the voltage must be high enough to drive a lethal amount of current through your body, overcoming body resistance; the voltage must be applied in the right places, so that the current path is through your (usually) heart muscle; and it must last long enough to stop the heart muscle due to fibrillation.

another good answer:
As I write this note, I’m looking at the textbook Basic Engineering Circuit Analysis by Irwin and Nelms (ISBN 0-471-48728-7). On page 449 the authors reference the work of Dr. John G. Webster who suggests the body resistance values that I have taken the liberty to sketch into this poor character.
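
[ed.: the back-of-envelope Ohm's law behind the first answer, as a tiny C calculation; the resistance and danger-threshold figures are rough textbook-style assumptions, not safety data.]

#include <stdio.h>

int main(void) {
    double v      = 230.0;    /* assumed mains voltage, volts */
    double r_body = 1000.0;   /* rough wet-skin hand-to-hand resistance, ohms */
    double i      = v / r_body;
    printf("I = V / R = %.0f mA through the body\n", i * 1000.0);
    printf("(a few tens of mA across the chest can already cause fibrillation)\n");
    return 0;
}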
nibble  q-n-a  qra  physics  electromag  dirty-hands  embodied  safety  short-circuit  IEEE  data  objektbuch  death 
september 2017 by nhaliday
electricity - Why is AC more "dangerous" than DC? - Physics Stack Exchange
One of the reasons that AC might be considered more dangerous is that it arguably has more ways of getting into your body. Since the voltage alternates, it can cause current to enter and exit your body even without a closed loop, since your body (and what ground it's attached to) has capacitance. DC cannot do that. Also, AC is quite easily stepped up to higher voltages using transformers, while with DC that requires some relatively elaborate electronics. Finally, while your skin has a fairly high resistance to protect you, and the air is also a terrific insulator as long as you're not touching any wires, sometimes the inductance of AC transformers can cause high-voltage sparks that break down the air and I imagine can get through your skin a bit as well.

Also, like you mentioned, the heart is controlled by electric pulses and repeated pulses of electricity can throw this off quite a bit and cause a heart attack. However, I don't think that this is unique to alternating current. I read once about an unfortunate young man that was learning about electricity and wanted to measure the resistance of his own body. He took a multimeter and set a lead to each thumb. By accident or by stupidity, he punctured both thumbs with the leads, and the small (I imagine it to be 9 V) battery in the multimeter caused a current in his bloodstream, and he died on the spot. So maybe ignorance is more dangerous than either AC or DC.
nibble  q-n-a  overflow  physics  electromag  dirty-hands  embodied  safety  short-circuit  IEEE  death 
september 2017 by nhaliday
Battle for the Planet of Low-Hanging Fruit | West Hunter
Peter Chamberlen the elder [1560-1631] was the son of a Huguenot surgeon who had left France in 1576. He invented obstetric forceps, a surgical instrument similar to a pair of tongs, useful in extracting the baby in a difficult birth. He, his brother, and his brother’s descendants preserved and prospered from their private technology for 125 years. They went to a fair amount of effort to preserve the secret: the pregnant patient was blindfolded, and all others had to leave the room. The Chamberlens specialized in difficult births among the rich and famous.
west-hunter  scitariat  discussion  history  early-modern  mostly-modern  stories  info-dynamics  science  meta:science  technology  low-hanging  fourier  europe  germanic  IEEE  ideas  the-trenches  alt-inst  discovery  innovation  open-closed 
may 2017 by nhaliday
Convex Optimization Applications
there was a problem in ACM113 related to this (the portfolio optimization SDP stuff)
pdf  slides  exposition  finance  investing  optimization  methodology  examples  IEEE  acm  ORFE  nibble  curvature  talks  convexity-curvature 
december 2016 by nhaliday
