Yahoo Web Search

Search results

  1. Mar 22, 2024 · Michael Mahoney is a professor of statistics and data science at UC Berkeley, with research interests in big data, machine learning, and scientific computing. He is also the director of the International Computer Science Institute and the Foundations of Data Analysis Institute, and has won several awards and honors for his work.

    • Home

      Most of my work focuses on the applied mathematics of data,...

    • Research

      Overview. The main focus of my work is on algorithmic and...

    • Publications

      Recent and Upcoming Developments in Randomized Numerical...

    • Talks

Talks and Presentations. Recent tutorial presentations: ...

    • Teaching

      Michael Mahoney - Teaching. Classes. Spring 2018: Linear...

    • Stat89a

Instructor: Michael Mahoney. Term: Spring 2018. Time and...

  2. Articles 1–20. Professor of Statistics, UC Berkeley - Cited by 33,584 - Algorithms - Statistics - Linear Algebra - Data Analysis - Machine Learning.

    • AI and Memory Wall
    • Using Uncertainty Quantification to Characterize and Improve Out-of-Domain Learning for PDEs
    • Chronos: Learning the Language of Time Series
    • Data-Efficient Operator Learning via Unsupervised Pretraining and In-Context Learning
    • Multi-scale Local Network Structure Critically Impacts Epidemic Spread and Interventions
    • An LLM Compiler for Parallel Function Calling
    • Temperature Balancing, Layer-wise Weight Analysis, and Neural Network Training
    • DMLR: Data-centric Machine Learning Research -- Past, Present and Future
    • Gated Recurrent Neural Networks with Weighted Time-Delay Feedback
    • Fully Stochastic Trust-Region Sequential Quadratic Programming for Equality-Constrained Optimization Problems
    • Randomized Numerical Linear Algebra: A Perspective on the Field With an Eye to Software
    • Monotonicity and Double Descent in Uncertainty Estimation with Gaussian Processes
    • Learning from learning machines: a new generation of AI technology to meet the needs of science
    • Long Expressive Memory for Sequence Modeling
    • Noisy Feature Mixup
    • Inexact Newton-CG Algorithms With Complexity Guarantees
    • Sparse sketches with small inversion bias
    • HAWQV3: Dyadic Neural Network Quantization
    • A Statistical Framework for Low-bitwidth Training of Deep Neural Networks
    • Training Recommender Systems at Scale: Communication-Efficient Model and Data Parallelism
    • PyHessian: Neural Networks Through the Lens of the Hessian
    • Exact expressions for double descent and implicit regularization via surrogate random design
    • LSAR: Efficient Leverage Score Sampling Algorithm for the Analysis of Big Time Series Data
    • HAWQ-V2: Hessian Aware trace-Weighted Quantization of Neural Networks
    • Trust Region Based Adversarial Attack on Neural Networks
    • Parameter Re-Initialization through Cyclical Batch Size Schedules
    • On the Computational Inefficiency of Large Batch Sizes for Stochastic Gradient Descent
    • The Mathematics of Data
    • Lectures on Randomized Numerical Linear Algebra
    • Avoiding Synchronization in First-Order Methods for Sparse Convex Optimization
    • Rethinking generalization requires revisiting old ideas: statistical mechanics approaches and complex learning behavior (a blog post about this paper is available)
    • LASAGNE: Locality And Structure Aware Graph Node Embedding
    • Avoiding communication in primal and dual block coordinate descent methods
    • Feature-distributed sparse regression: a screen-and-clean approach
    • Multi-label learning with semantic embeddings
    • Mapping the Similarities of Spectra: Global and Locally-biased Approaches to SDSS Galaxy Data
    • Faster Parallel Solver for Positive Linear Programs via Dynamically-Bucketed Selective Coordinate Descent
    • A Local Perspective on Community Structure in Multilayer Networks
    • Optimal Subsampling Approaches for Large Sample Linear Regression
    • Unified Acceleration Method for Packing and Covering Problems via Diameter Reduction
  3. Michael “Mike” Mahoney is Chief Executive Officer of Boston Scientific Corporation and Chairman of the company’s Board of Directors. Boston Scientific is a global medical technology leader with approximately $14.2 billion in annual revenue and commercial representation in more than 140 countries.

  4. Michael W. Mahoney is at the University of California at Berkeley in the Department of Statistics and at the International Computer Science Institute (ICSI). He is also an Amazon Scholar as well as a faculty scientist at the Lawrence Berkeley National Laboratory.

  5. Michael W. Mahoney. Dynamical systems that evolve continuously over time are ubiquitous throughout science and engineering. Machine learning (ML) provides data-driven approaches to model and...

  6. Oct 18, 2020 · Michael Mahoney - Dynamical systems and machine learning. Prof. Michael Mahoney from UC Berkeley speaking in the Data-driven methods for science and engineering seminar. Recorded on...

    • 70 min
    • 3.6K
    • Physics Informed Machine Learning