Yahoo Web Search

Search results

  1. Mar 22, 2024 · Michael Mahoney is a professor of statistics and data science at UC Berkeley, with research interests in big data, machine learning, and scientific computing. He is also the director of the International Computer Science Institute and the Foundations of Data Analysis Institute, and has won several awards and honors for his work.

    • Home

      I am in the Department of Statistics at UC Berkeley; I am...

    • Research

      Overview. The main focus of my work is on algorithmic and...

    • Publications

      Recent and Upcoming Developments in Randomized Numerical...

    • Talks

      Talks and Presentations Recent tutorial presentations: ....

    • Teaching

      Michael Mahoney - Teaching. Classes. Spring 2018: Linear...

    • Stat89a

      Instructor: Michael Mahoney Term: Spring 2018. Time and...

    • CDSS at UC Berkeley

      As Principal Investigator Michael Mahoney (link is external)...

    • FODA Institute

      FODA Institute PIs/coPIs: Michael Mahoney (Director), Bin...

  2. Michael “Mike” Mahoney is Chief Executive Officer of Boston Scientific Corporation and Chairman of the company’s Board of Directors. Boston Scientific is a global medical technology leader with approximately $14.2 billion in annual revenue and commercial representation in more than 140 countries.

  3. Articles 1–20. Professor of Statistics, UC Berkeley - Cited by 33,584 - Algorithms - Statistics - Linear Algebra - Data Analysis - Machine Learning.

  4. Michael Mahoney - Forbes (www.forbes.com › profile › michael-mahoney)

    Michael Mahoney has served as Boston Scientific's president and CEO since 2012. In 2016 he became its chairman. Mahoney was also the Worldwide Chairman of the Medical Devices and Diagnostics ...

    Selected publications:
    • AI and Memory Wall
    • Using Uncertainty Quantification to Characterize and Improve Out-of-Domain Learning for PDEs
    • Chronos: Learning the Language of Time Series
    • Data-Efficient Operator Learning via Unsupervised Pretraining and In-Context Learning
    • Multi-scale Local Network Structure Critically Impacts Epidemic Spread and Interventions
    • An LLM Compiler for Parallel Function Calling
    • Temperature Balancing, Layer-wise Weight Analysis, and Neural Network Training
    • DMLR: Data-centric Machine Learning Research -- Past, Present and Future
    • Gated Recurrent Neural Networks with Weighted Time-Delay Feedback
    • Fully Stochastic Trust-Region Sequential Quadratic Programming for Equality-Constrained Optimization Problems
    • Randomized Numerical Linear Algebra: A Perspective on the Field With an Eye to Software
    • Monotonicity and Double Descent in Uncertainty Estimation with Gaussian Processes
    • Learning from learning machines: a new generation of AI technology to meet the needs of science
    • Long Expressive Memory for Sequence Modeling
    • Noisy Feature Mixup
    • Inexact Newton-CG Algorithms With Complexity Guarantees
    • Sparse sketches with small inversion bias
    • HAWQV3: Dyadic Neural Network Quantization
    • A Statistical Framework for Low-bitwidth Training of Deep Neural Networks
    • Training Recommender Systems at Scale: Communication-Efficient Model and Data Parallelism
    • PyHessian: Neural Networks Through the Lens of the Hessian
    • Exact expressions for double descent and implicit regularization via surrogate random design
    • LSAR: Efficient Leverage Score Sampling Algorithm for the Analysis of Big Time Series Data
    • HAWQ-V2: Hessian Aware trace-Weighted Quantization of Neural Networks
    • Trust Region Based Adversarial Attack on Neural Networks
    • Parameter Re-Initialization through Cyclical Batch Size Schedules
    • On the Computational Inefficiency of Large Batch Sizes for Stochastic Gradient Descent
    • The Mathematics of Data
    • Lectures on Randomized Numerical Linear Algebra
    • Avoiding Synchronization in First-Order Methods for Sparse Convex Optimization
    • Rethinking generalization requires revisiting old ideas: statistical mechanics approaches and complex learning behavior
    • LASAGNE: Locality And Structure Aware Graph Node Embedding
    • Avoiding communication in primal and dual block coordinate descent methods
    • Feature-distributed sparse regression: a screen-and-clean approach
    • Multi-label learning with semantic embeddings
    • Mapping the Similarities of Spectra: Global and Locally-biased Approaches to SDSS Galaxy Data
    • Faster Parallel Solver for Positive Linear Programs via Dynamically-Bucketed Selective Coordinate Descent
    • A Local Perspective on Community Structure in Multilayer Networks
    • Optimal Subsampling Approaches for Large Sample Linear Regression
    • Unified Acceleration Method for Packing and Covering Problems via Diameter Reduction
  5. Research Areas. Statistical Computing. Applications in the Physical and Environmental Sciences. Applications in the Social Sciences. High Dimensional Data Analysis.

  6. Michael W. Mahoney is at the University of California at Berkeley in the Department of Statistics and at the International Computer Science Institute (ICSI). He is also an Amazon Scholar as well as a faculty scientist at the Lawrence Berkeley National Laboratory.