
Y. Lee

Here are all the papers by Y. Lee that you can download and read on OA.mg.

DOI: 10.1209/epl/i2000-00464-8
2000
Cited 50 times
Scale-invariant truncated Lévy process
We develop a scale-invariant truncated Lévy (STL) process to describe physical systems characterized by correlated stochastic variables. The STL process exhibits Lévy stability for the distribution, and hence shows scaling properties as commonly observed in empirical data; it has the advantage that all moments are finite and so accounts for the empirical scaling of the moments. To test the potential utility of the STL process, we analyze financial data.
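The abstract does not reproduce the construction of the STL process itself. As a hedged background sketch only (an exponentially truncated Lévy distribution with a Koponen-style smooth cutoff, not necessarily the exact STL definition used in this paper), the basic idea of combining Lévy scaling with finite moments can be written as follows:

```latex
% Exponentially truncated symmetric Lévy density (Koponen-type cutoff),
% given here as an assumed background form, not the paper's STL definition:
\[
  p(x) \;\propto\; e^{-\lambda |x|}\, L_{\alpha}(x),
  \qquad
  L_{\alpha}(x) \;=\; \frac{1}{\pi}\int_{0}^{\infty} e^{-\gamma q^{\alpha}} \cos(qx)\, \mathrm{d}q ,
\]
% Here $0 < \alpha < 2$ is the Lévy index, $\gamma$ a scale factor, and
% $\lambda > 0$ the truncation rate.  The damping factor $e^{-\lambda|x|}$
% makes every moment finite, while the central region of $p(x)$ retains
% the power-law scaling of the Lévy stable density $L_{\alpha}$.
```

In this picture the truncation is what reconciles the empirically observed Lévy-like scaling with the finiteness of the moments that the abstract emphasizes.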
DOI: 10.1109/ijcnn.1991.155499
1991
Pre-segmented handwritten digit recognition using neural networks
Summary form only given. Results of current research suggest that multilayer neural networks with local 'receptive fields' and shared weights can be applied successfully to presegmented handwritten digit recognition. It was demonstrated that handwritten digit recognition without the segmentation problem is actually quite simple; even traditional techniques such as the k-nearest-neighbor (KNN) classifier can provide good performance. Back-propagation, radial basis function (RBF) networks, and KNN classifiers all provide similarly low error rates on a large presegmented handwritten digit database. The effectiveness of these classifiers' 'confidence' measures was also evaluated. The back-propagation network uses less memory and provides faster classification, but can produce 'false positive' classifications when the input is not a digit. The RBF network generates a more effective confidence judgement for rejecting ambiguous inputs when high accuracy is warranted. The KNN classifier requires a prohibitively large amount of memory and is much slower at classification, yet has surprisingly good performance.
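As a minimal sketch of the KNN baseline mentioned in the abstract, the snippet below trains a k-nearest-neighbour classifier on presegmented digit images. It assumes scikit-learn's small bundled digits dataset as a stand-in for the large database described in the paper, and is not the authors' data or implementation.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Load 8x8 presegmented digit images (a small stand-in for the large
# presegmented database referenced in the abstract).
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# k-nearest-neighbour classifier: no training beyond storing the data,
# but memory grows with the training set and prediction requires
# comparing each query against every stored example.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)
print("KNN test accuracy:", knn.score(X_test, y_test))
```

The memory and classification-speed drawbacks noted in the abstract follow directly from KNN storing every training image and scanning them all at prediction time.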