Laser-induced incandescence (LII) was used to derive temperatures of pulsed-laser-heated soot particles from their thermal emission intensities detected at two wavelengths in a laminar ethylene/air co-annular diffusion flame. The results are compared to those of a numerical nanoscale heat and mass transfer model. Both aggregate and primary particle soot size distributions were measured using transmission electron microscopy (TEM), and the model predictions were numerically averaged over these experimentally derived size distributions. The excitation laser wavelength was 532 nm, and the LII signal was detected at 445 nm and 780 nm. A wide range of laser fluences, from very low to moderate (0.13 to 1.56 mJ/mm²), was used in the experiments. A large part of the temporal decay curve, beginning 12–15 ns after the peak of the laser excitation pulse, is successfully described by the model, allowing the thermal accommodation coefficient to be determined; it varies somewhat with soot temperature and lies in the range of 0.36 to 0.46. However, in the soot evaporative regime, the model greatly overpredicts the cooling rate shortly after the laser pulse. At lower fluences, where evaporation is negligible, the initial experimental cooling rates immediately following the laser pulse are anomalously high. Potential physical processes that could account for these effects are discussed. From the present data, a soot absorption function, E(m), of 0.4 at 532 nm is obtained. A procedure for correcting the measured signals for the flame radiation is presented. It is further shown that accounting for the local gas temperature increase, due to heat transfer from the soot particles to the gas, significantly improves the agreement between the modeled and measured temperature dependence of soot cooling rates over a large range of laser fluences.
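As a minimal sketch of the two-wavelength (ratio pyrometry) temperature determination mentioned above, and not necessarily the exact formulation used in this work, the soot temperature follows from the calibrated, flame-radiation-corrected signals \(S_{\lambda_1}\) and \(S_{\lambda_2}\) at the detection wavelengths \(\lambda_1 = 445\) nm and \(\lambda_2 = 780\) nm, assuming Rayleigh-regime emission and the Wien approximation:
\[
T \;=\; \frac{hc}{k_B}\left(\frac{1}{\lambda_2}-\frac{1}{\lambda_1}\right)
\Bigg/ \ln\!\left[\frac{S_{\lambda_1}}{S_{\lambda_2}}\,
\frac{E(m,\lambda_2)}{E(m,\lambda_1)}
\left(\frac{\lambda_1}{\lambda_2}\right)^{6}\right],
\]
where \(E(m,\lambda)\) is the soot absorption function at each wavelength and \(h\), \(c\), and \(k_B\) are the Planck constant, the speed of light, and the Boltzmann constant, respectively.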