# Contributed and Invited Presentations

##### The Undecidability of Network Coding With Some Fixed-Size Messages and Edges [virtual]

##### Enhancing the Decoding Rates of BATS Codes by Learning With Guided Information [virtual]

##### Covering Sequences for \(\ell\)-Tuples

##### Perfect Codes Correcting a Single Burst of Limited-Magnitude Errors

##### The Gapped \(k\)-Deck Problem

##### The DNA Storage Channel: Capacity and Error Probability Bounds

##### Log-CCDM: Distribution Matching via Multiplication-Free Arithmetic Coding

##### Centralised Multi-Link Measurement Compression With Side Information

##### Universal Compression of Large Alphabets With Constrained Compressors [virtual]

##### Ternary Convolutional LDGM Codes With Applications to Gaussian Source Compression [virtual]

##### The Communication Value of a Quantum Channel

##### Singleton Bounds for Entanglement-Assisted Classical and Quantum Error Correcting Codes [virtual]

##### Communicating Over Classical-Quantum MAC With State Information Distributed at Senders

##### Analytical Calculation Formulas for Capacities of Classical and Classical-Quantum Channels [virtual]

##### Secure Distributed/Federated Learning: Prediction-Privacy Trade-Off for Multi-Agent System

##### SwiftAgg: Communication-Efficient and Dropout-Resistant Secure Aggregation for Federated Learning With Worst-Case Security Guarantees

##### Fundamental Limits of Personalized Federated Linear Regression With Data Heterogeneity

##### Social Learning Under Randomized Collaborations

##### On How to Avoid Exacerbating Spurious Correlations When Models are Overparameterized

##### Asymptotic Behavior of Adversarial Training in Binary Linear Classification

##### Understanding Deep Neural Networks Using Sliced Mutual Information

##### A Unified \(f\)-Divergence Framework Generalizing VAE and GAN

##### AoI in Source-Aware Preemptive M/G/1/1 Queueing Systems: Moment Generating Function

##### Query Age of Information: Optimizing AoI at the Right Time

##### Analysis of an Age-Dependent Stochastic Hybrid System

##### Performance Modeling of Scheduling Algorithms in a Multi-Source Status Update System

##### Efficient Representation of Large-Alphabet Probability Distributions via Arcsinh-Compander

##### A Tighter Approximation Guarantee for Greedy Minimum Entropy Coupling (JKW Award Finalist)

##### On Information-Theoretic Determination of Misspecified Rates of Convergence

##### Probability Distribution on Rooted Trees

##### Multichannel Optimal Tree-Decodable Codes are Not Always Optimal Prefix Codes

##### Error-Erasure Decoding of Linearized Reed-Solomon Codes in the Sum-Rank Metric

##### Fully Analog Noise-Resilient Dynamical Systems Storing Binary Sequence

##### Universal Decoding for the Typical Random Code and for the Expurgated Code

##### Capacity-Achieving Constrained Codes With GC-Content and Runlength Limits for DNA Storage [virtual]

##### On Homopolymers and Secondary Structures Avoiding, Reversible, Reversible-Complement and GC-Balanced DNA Codes

##### Capacity of the Shotgun Sequencing Channel [virtual]

Most DNA sequencing technologies are based on the shotgun paradigm: many short reads are obtained from random unknown locations in the DNA sequence. A fundamental question, studied in [1], is what read length and coverage depth (i.e., the total number of reads) are needed to guarantee reliable sequence reconstruction. Motivated by DNA-based storage, we study the coded version of this problem; i.e., the scenario in which the DNA molecule being sequenced is a codeword from a predefined codebook. Our main result is an exact characterization of the capacity of the resulting shotgun sequencing channel as a function of the read length and coverage depth. In particular, our results imply that while in the uncoded case, \(O(n)\) reads of length greater than \(2\log n\) are needed for reliable reconstruction of a length-\(n\) binary sequence, in the coded case, only \(O(n/\log n)\) reads of length greater than \(\log n\) are needed for the capacity to be arbitrarily close to 1.
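The scaling gap in the last sentence can be made concrete with a small sketch. The function names and the choice of base-2 logarithms below are my own; only the asymptotic orders come from the abstract, and the constant factors are illustrative:

```python
import math

def uncoded_scaling(n):
    """Uncoded case: order-n reads, each of length greater than
    2*log2(n), are needed for reliable reconstruction.
    (Orders from the abstract; constants illustrative.)"""
    return n, 2 * math.log2(n)

def coded_scaling(n):
    """Coded case: order n/log2(n) reads, each of length greater than
    log2(n), suffice for capacity arbitrarily close to 1.
    (Orders from the abstract; constants illustrative.)"""
    return n / math.log2(n), math.log2(n)

n = 2**20  # a length-n binary sequence, n about one million
reads_u, min_len_u = uncoded_scaling(n)
reads_c, min_len_c = coded_scaling(n)
print(f"uncoded: ~{reads_u:.0f} reads of length > {min_len_u:.0f}")
print(f"coded:   ~{reads_c:.0f} reads of length > {min_len_c:.0f}")
```

Under these illustrative constants, coding buys roughly a \(\log n\) factor in the number of reads and a factor of 2 in the required read length.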

##### Finite-State Semi-Markov Channels for Nanopore Sequencing

##### Density Estimation of Processes With Memory via Donsker-Varadhan

##### Matroidal Entropy Functions: Constructions, Characterizations and Representations [virtual]

##### Generalized Longest Repeated Substring Min-Entropy Estimator

##### Modeling Network Contagion via Interacting Finite Memory Polya Urns

##### Covert Communication in the Presence of an Uninformed, Informed, and Coordinated Jammer

##### Covert Communication With Mismatched Decoders

##### Towards a Characterization of the Covert Capacity of Bosonic Channels Under Trace Distance

We establish upper and lower bounds on the covert capacity of lossy thermal-noise bosonic channels when covertness is measured by fidelity and by trace distance. Although we fall short of characterizing the exact covert capacity, we additionally provide bounds on the number of secret-key bits required to achieve covertness. The bounds are established by combining recent quantum information theory results in separable Hilbert spaces, including position-based coding (Oskouei et al., arXiv:1804.08144), convex splitting (Khatri et al., arXiv:1910.03883), and perturbation theory (Grace and Guha, arXiv:2106.05533).