PARTICLE SIZE MEASUREMENT IN THE NANOMETER RANGE
SICO Scientific Industrial Consulting Office
by Dipl.-Chem. Wolfgang Laemmle, SICO Scientific Industrial Consulting Office, formerly with SYMPATEC GmbH. Contact: sico@laemmle.org, www.sico.cc
TECHNICAL TERM: NANO is a term that is used incorrectly in many cases, even in technical literature. The Technical Committee 229 of ISO deserves thanks for an exact definition with respect to particle size measurement. According to this definition, nano-objects should be called nano-particles only if all three of their coordinate dimensions lie within the nano-range of about 1 nm to 100 nm. The ISO taxonomy reads:

- Nanomaterial (one or more external dimensions, or an internal or surface structure, on the nanoscale), ISO/NP TS 12144
- Nano-object (one or more external dimensions on the nanoscale), ISO/TS 27687
- Nanoparticle (3 external dimensions on the nanoscale)
- Nanotube (2 external dimensions on the nanoscale), e.g. single nanotubes, nanotube bundles
- Nanoplate (1 external dimension on the nanoscale), e.g. free nanoplates
- Assembled nanomaterial, e.g. nanocomposites (silica sol composites), shell structures (hollow Al shell structures), fumed silica

(Figure: interaction of laser light with particles in the nanometre range.)
Regrettably, even today many scientific publications still use the term NANO very imprecisely. One reason for this might be the difficulty of measuring nano-particles with sufficient precision; imprecise knowledge of dimensions often leads to inaccurate classification. In general it can be stated that nano-particles are smaller than the wavelength of visible light. For this reason, particle size determination by technologies based on image interpretation (light microscopes, cameras etc.) cannot be applied in principle. Only light scattering effects can be used, and these provide indirect information only.

HISTORY: The most popular and best researched method using light scattering is Photon Correlation Spectroscopy (PCS), including its variations that observe scattered light under different angles. PCS conferences started as informal discussions between members of different institutions in the late 1960s, known as the colloquiums of the 'Correlator Club'. The first PCS instruments became available from the late 1970s, manufactured by different companies. These early instruments needed a very robust and simple calculation mode for evaluation, the '2nd cumulant' method, because the calculation power of the computers then available was still very limited. Still today this simplified calculation method is positioned in the related standard, ISO 13321. It generates as results only a mean particle diameter and a value for the width of an assumed Gaussian distribution. Because of this, such early instruments were unable to describe multimodal distributions correctly and had only a very limited applicability. Increasing calculation power enabled the use of more sophisticated evaluation methods: first the patented CONTIN method came into focus, and later this was improved further by still less damped methods such as 'Non-Negative Least Squares' (NNLS), in which an extraordinarily unstable set of equations demands extremely high calculation power. This fact is taken into account today by the revision of the established but outdated ISO 13321.
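The 2nd cumulant evaluation mentioned above can be sketched numerically. The following is a minimal illustration, not any instrument's actual implementation: a quadratic fit of ln g1(τ) yields the mean decay rate Γ (convertible to a mean diameter via the Stokes-Einstein relation) and a polydispersity index; all numerical values are invented for the example.

```python
import numpy as np

# Synthetic field correlation function g1(tau) for a single decay rate
# Gamma; all numbers are invented for this illustration.
gamma_true = 5.0e3                    # decay rate, 1/s
tau = np.linspace(1e-6, 5e-4, 200)    # lag times, s
g1 = np.exp(-gamma_true * tau)

# 2nd-order cumulant expansion: ln g1(tau) = -Gamma*tau + (mu2/2)*tau^2.
# A quadratic polynomial fit of ln g1 against tau recovers both terms.
c2, c1, c0 = np.polyfit(tau, np.log(g1), 2)
gamma_fit = -c1                       # mean decay rate
mu2 = 2.0 * c2                        # second cumulant
pdi = mu2 / gamma_fit**2              # polydispersity index (width measure)

print(gamma_fit, pdi)
```

For this monodisperse input the fit returns Γ close to the true decay rate and a polydispersity index near zero; a broad or multimodal sample would show up only as an inflated PDI, which is exactly the limitation discussed above.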
Standard evaluation mode (2nd cumulant), ISO 13321:
- presumes and requires monomodality
- yields a mean distribution diameter and an indication of the width of the distribution (width ± %), e.g. mean diameter 195 nm, width (2nd virial coefficient) ± 10.2%
- provides no information regarding the internal structure of the total distribution

NNLS (Non-Negative Least Squares):
- the selection of the best fitting window determines the quality of the evaluation
- allows the structure within the distribution to be indicated
- with a correctly set fit range, reliable information on multiple populations in the distribution is provided

(Figure: fluctuating scattered-light intensity I(t, Θ); sample: NIST gold standard.)
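The NNLS approach can likewise be sketched: the correlation function is modelled as a non-negative sum of exponential decays on a fixed grid of decay rates (the fit window). The following minimal example uses SciPy's nnls solver on synthetic bimodal data; decay rates, amplitudes and grid limits are invented for illustration only.

```python
import numpy as np
from scipy.optimize import nnls

# Two particle populations -> bimodal set of decay rates (illustrative).
tau = np.logspace(-6, -2, 120)               # lag times, s
gamma_true = np.array([2.0e4, 1.0e3])        # decay rates, 1/s
amp_true = np.array([0.6, 0.4])
g1 = (amp_true[:, None] * np.exp(-gamma_true[:, None] * tau)).sum(axis=0)

# NNLS: fit g1(tau) = sum_i A_i * exp(-gamma_i * tau) with A_i >= 0
# over a fixed grid of decay rates (the "fit window").
gamma_grid = np.logspace(2, 5, 60)
kernel = np.exp(-np.outer(tau, gamma_grid))  # shape (n_tau, n_gamma)
amplitudes, residual = nnls(kernel, g1)

# Non-zero entries in `amplitudes` mark the populations; a monomodal
# cumulant fit would have averaged them into one mean decay rate.
print(gamma_grid[amplitudes > 0.05])
```

Note how the choice of `gamma_grid` (the fit window) directly determines what the evaluation can resolve, which is the point made above: with a poorly set window the populations are smeared or missed entirely.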
INSTRUMENTS: The only instrumental setup that is able to eliminate multiple scattering entirely via cross-correlation, and is thus able to measure largely independently of the given concentration, is Photon Cross-Correlation Spectroscopy (PCCS).
(Figure: Brownian motion. The ISO taxonomy additionally defines nano-structured material (inner or surface structure on the nanoscale), ISO/WD TS 12921, e.g. material with a nanostructured surface.)
PRINCIPLE: Dynamic light scattering, the basic principle of the PCS method, makes use of the so-called Brownian motion. This is based on the effect, described in 1905/1906 by Einstein and Smoluchowski, that particles in suspension are moved about by elastic collisions with the surrounding liquid molecules, and that the extent of the resulting movement is related to the mass and volume of the particular particle. This movement creates a fluctuation of the scattered-light intensity in the visual field of a stationary observer, and the frequency of this fluctuation is directly related to the size of the moving particles.
(Figure: snapshots of the particle ensemble at t = 0, t = ∆t and t = 2∆t. The changing short-range order of the particles yields a fluctuating intensity distribution; the intensity at scattering angle Θ oscillates with time t.)
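The size information is extracted from the fluctuation via the Stokes-Einstein relation, D = kB·T/(3πηd), together with the measured decay rate Γ = D·q². A minimal worked example follows; the medium and optics parameters (water, He-Ne laser, 90° detection) are common textbook values assumed for illustration, not taken from any particular instrument.

```python
import numpy as np

kB = 1.380649e-23       # Boltzmann constant, J/K
T = 298.15              # temperature, K
eta = 0.89e-3           # viscosity of water at 25 C, Pa*s
n = 1.33                # refractive index of water
wavelength = 632.8e-9   # He-Ne laser, m
theta = np.deg2rad(90.0)

# Scattering vector q = (4*pi*n/lambda) * sin(theta/2)
q = 4 * np.pi * n / wavelength * np.sin(theta / 2)

def hydrodynamic_diameter(gamma):
    """Diameter (m) from a measured decay rate Gamma (1/s)."""
    D = gamma / q**2                      # diffusion coefficient, m^2/s
    return kB * T / (3 * np.pi * eta * D) # Stokes-Einstein

# Sanity check: forward-calculate Gamma for a 100 nm sphere, then invert.
d = 100e-9
D = kB * T / (3 * np.pi * eta * d)
gamma = D * q**2
print(hydrodynamic_diameter(gamma))       # recovers 1e-7 m
```

The inverse dependence on Γ also makes the distortion mechanism below plausible: anything that artificially speeds up the apparent intensity fluctuation (e.g. multiple scattering) is interpreted as a smaller particle.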
But this very direct relation is valid only under the following restrictive conditions:

1. Only freely moving, spherical particles may be present. Deviation from perfect sphericity is not decisive, as the particles are surrounded by an ionic shell that smooths out extreme shapes anyway. Free movement, on the other hand, is a critical parameter, because the ionic shells may interact when, due to higher particle concentration, the distance between particles decreases too far.

2. Unrestricted visibility of the moving particles, or rather of the emitted scattered light, has to be guaranteed. In the case of high concentration and a detection area deep inside the suspension, the observation of the fluctuation of the scattered-light intensity is disturbed by other particles located between the detection area and the observer. Multiple scattering occurs, which can distort the result severely: particles with a real diameter of 100 nm can be erroneously measured as small as some 20 nm if a certain amount of multiple scattering is present. For this reason it used to be necessary to measure at very small concentrations only, to avoid this effect. Doing so resulted in a bad signal-to-noise ratio and potential modification of the sample due to dilution-related changes in the ionic surrounding; in many cases this influence closed even this back door. Better solutions to the problem were attempted by observing the scattered-light intensity fluctuation under different angles to gain clearer information, and later also by applying a nearly backwards-directed signal (168°) combined with a minimal depth of invasion, the so-called 'back scattering'. The multiple-angle measuring method has meanwhile been abandoned, because it could only be performed consecutively and always failed in the case of fast changes.
Furthermore, the evaluation of multiple results from different angles is in principle already quite complex and, in addition, strongly dependent on the scattering capability of the sample. By contrast, the back-scattering method has become very popular and is realized in nearly all current instruments. This technology, however, neglects the fact that, when observing thin marginal layers close to the cuvette wall, the region of elastic collisions is left. The essential basic principle of the Stokes-Einstein relation, and thereby also of the PCS measuring method, namely the interaction of particles and liquid molecules through elastic collisions, is increasingly overlaid by non-elastic wall contacts the closer the observation area is located to the wall. No commonly known correction is available for this kind of error.
With the cross-correlation principle of PCCS, the influence of the above-mentioned disturbances and errors is entirely eliminated: two light beams of identical wavelength and intensity are focused onto an identical measuring volume, and the fluctuation of the scattered light of each is detected under 90°. Only the part identical in both measurements is separated out by cross-correlation and taken into account. Hence, independent of concentration, only unadulterated primary signals contribute to the result. This technology is further supported by exact and automated positioning for different types of cuvettes, which also takes advantage of a minimized depth of invasion, but without extending it into the region of non-elastic collisions close to the wall. In addition, the raw signal is optimized by a laser intensity dimmer that adjusts the yield of primarily scattered light to an optimum. This guarantees that, widely independent of sample properties and concentration, the best possible signal yield is obtained. Special cases of far too high concentration are easy to detect and do not generate any results, because in such cases nothing is left for evaluation after cross-correlation. After appropriately accurate evaluation, the results will be either reliable or none at all, but never wrong.

RELEVANCE: Every modern particle-size measuring instrument provides smooth-looking diagrams and the related data, but only very seldom a hint of the relevance of the presented result. In many cases the simplest automated handling, as well as the possibility of additional information on e.g. the zeta potential, are the key points of advertisement, even though it is known that the dilution of the original suspension, necessary in most cases, changes the zeta potential. A hint of the relevance or correctness of the produced result is only sometimes given, by a number for the accuracy of the 'fit'.
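The effect of cross-correlation described under INSTRUMENTS can be illustrated with a toy model (not the actual PCCS signal chain): two detector signals share a common component, while each carries its own uncorrelated disturbance standing in for multiple scattering and detector noise. The autocorrelation of one detector retains the disturbance power; the cross-correlation of the two keeps only the common part. All signal statistics are invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

s = rng.normal(size=n)          # common (singly scattered) component
a = s + rng.normal(size=n)      # detector A: common part + own disturbance
b = s + rng.normal(size=n)      # detector B: common part + own disturbance

def corr0(x, y):
    """Zero-lag correlation of mean-free signals."""
    return np.mean((x - x.mean()) * (y - y.mean()))

auto = corr0(a, a)    # autocorrelation keeps the uncorrelated noise power
cross = corr0(a, b)   # cross-correlation keeps only the common part
print(auto, cross)    # auto ~ 2.0, cross ~ 1.0
```

With unit-variance components, the autocorrelation measures common power plus disturbance power (about 2.0 here), while the cross-correlation converges to the common power alone (about 1.0): the uncorrelated contributions average out, which is the core of the concentration independence claimed above.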
Much more important than the interpretation of the result is the careful inspection of the correlation-function diagram, which shows whether the result generated by the selected evaluation mode matches the raw-signal correlation diagram properly. If meaningful differences are obvious, the generated result has to be regarded as a misinterpretation and discarded immediately, and a more sophisticated evaluation mode should be applied. Complex evaluation methods are available for this, e.g. the NNLS method, which despite its complexity can easily be used by everybody. Even though this contradicts the requirement of simplest handling, it is the only way to generate reliable, accurate results of highest resolution.
3. A basic rule in mechanical engineering states that for the measurement of a particle size distribution with a standard deviation of 1%, a number of at least 10⁵ particles per size class is needed. It has to be doubted that such an amount of data, and thereby reliable results, can be achieved during extremely short but customer-friendly measuring times, with consideration of selected time intervals with very smooth correlation functions only. The theoretically necessary duration of a correct measurement, related to a correspondingly high number of events, is 10⁶ times the decay time of the coarsest measured particle. Even more critical is the attempt to derive a distribution result for an entire sample by tracing the movement of only a few single particles, as propagated for a recently developed new instrument. Such extremely shortened measurements are as meaningless as, for example, results from electron microscopy, which are also based on a limited number of particles only. The attractiveness of such measurements rests on the detailed presentation of single objects; with respect to the characterization of an entire sample such a method is not really suitable, but in spite of this it is in many cases over-interpreted.

CONCLUSION: Promises made in high-gloss brochures very often prove misleading in application with real samples, because they disregard the considerations discussed above under PRINCIPLE, INSTRUMENTS and RELEVANCE. Increasingly the good news is spreading that nano measurement can today be done in a much more scientific and exact way, even without highly qualified employees, and in that connection the names NANOPHOX and SYMPATEC are mentioned more and more often.
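The duration rule quoted above (10⁶ times the decay time of the coarsest particle) can be turned into a rough estimate. The sketch below takes the decay time as 1/(D·q²) and assumes water, a He-Ne laser and 90° detection; all parameter values are illustrative assumptions, not requirements of any specific instrument.

```python
import numpy as np

kB, T, eta = 1.380649e-23, 298.15, 0.89e-3   # J/K, K, Pa*s (water, 25 C)
n_med, wavelength = 1.33, 632.8e-9           # water, He-Ne laser
q = 4 * np.pi * n_med / wavelength * np.sin(np.deg2rad(90) / 2)

def required_duration(d_coarse):
    """Estimated measuring time (s) for coarsest diameter d_coarse (m),
    using the rule of thumb: 1e6 * decay time of the coarsest particle."""
    D = kB * T / (3 * np.pi * eta * d_coarse)  # Stokes-Einstein
    tau_decay = 1.0 / (D * q**2)               # correlation decay time
    return 1e6 * tau_decay

for d in (100e-9, 1e-6):
    print(f"{d * 1e9:.0f} nm -> {required_duration(d):.0f} s")
```

Under these assumptions a sample whose coarsest particles are 100 nm already calls for roughly ten minutes of measurement, and a 1 µm fraction for ten times as long, which puts the "customer-friendly" measuring times criticized above into perspective.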