Dear Israel,
As regards concepts, it is difficult to see how they could evolve. Until Newton it was uncontested knowledge that force can only be conveyed by collision. Accordingly, his gravitation, which is action at a distance, was initially ridiculed by some. Nor do I see a viable trajectory from Newton's to Einstein's gravitation. Further, Huygens' superposition of unit waves, which explains diffraction etc., cannot be derived from the then-prevailing geometrical optics. Likewise, from Huygens to Fresnel I see no change of principles, only refinement. Rather, if concepts were soft, I believe that nothing at all could be observed. Models, on the other hand, are indeed soft and for this reason are not laws of nature, just models. Let me explain in some more detail and bore you with a little bit of philosophy...
Irrespective of whether it is a model or a natural law, any speculation beyond the given requires pre-knowledge. How else could we express such speculation? The task at hand, then, is the weighing of such speculation in light of pre-knowledge. The 17th-century philosopher Baruch de Spinoza held the following: "A thing necessarily exists, if no cause or reason be granted preventing its existence." What he says is that a thing exists if it is non-contradictory. Popper said that we can accept a theory as long as it hasn't been falsified. Falsification, however, is contradiction. So, are Spinoza and Popper saying the same thing?
It is important to recall that Popper's theory of falsification (and Spinoza's assertion per se) is a theory over propositions (Sätze). Only propositions can falsify propositions. Measurements, experiments and data as such have no place in Popper's theory, i.e. falsification is an armchair job. This means mere facts can neither verify nor falsify theories. The first stage of attempting to falsify a theory consists, according to Popper, in a consistency check, which I call the semantics check: can we reasonably speak about what the theory purports to claim? In the case of quantum mechanics his conclusion was: no, we can't, unless we consider statistical quantum ensembles in a frequentist sense. Well, it doesn't take super-intelligence to find that any other traditional interpretation is a-semantic, thus failing the pre-test. Recent QM and relativity interpretations/extensions would likewise not pass the semantic test and hence end up dead in the water even before reaching the core of the falsification procedure: the weighing of two semantically consistent theories against each other. I guess Spinoza would buy this.
But then there is a difference: the decisive word in Spinoza's assertion that things exist "if no cause or reason be granted..." is the word NO, whereas Popper suggests falsification against so-called Basissätze (basic propositions), i.e. a set of fundamental propositions that is believed to be 'true'. This is where Popper changes over from falsification to affirmation, which is super-critical not because the chosen set may contain false propositions, but because it reduces the totality of human knowledge to almost nothing and entirely excludes human experience in the world, i.e. the phenomena. This is what Spinoza would have rejected, for other knowledge domains (e.g. biology) and the phenomena make up an estimated 99.9% of our daily propositions.
What Popper effectively proposed is the minimization of pre-knowledge, and physicists increasingly began to substitute facts (data) for propositions, which eventually led to the physicists' outcry 'we cannot possibly falsify our theories', when they actually meant 'we cannot possibly falsify our data'. Now, empiricism is a posteriori rule-making over data censored by the phenomena, i.e. modeling (e.g. of the climate), which is to be distinguished from a priori (Spinozan or Kantian) laws of nature, which value the whole of knowledge and experience. In the absence of phenomena, which is the case for all of theoretical physics, such modeling turns into pseudo-empiricism, for data are no empirical evidence whatsoever.
All the best for your essay,
Heinz