Transport Layer (1/3) Goal: 1. Understand the principles behind transport-layer services such as demultiplexing, multiplexing, reliable data transfer (TCP), flow control, and congestion control. 2. Learn about the Internet's transport-layer protocols (TCP, UDP). Chapter 3.1) Transport Layer Services Chapter 3.2) Multiplexing and Demultiplexing Chapter 3.3) Connectionless Transport (UDP) Chapter 3.4) Principles of Reliable Data..
Application Layer (3/3) Socket Programming - Now that we've looked at a number of important network applications, let's explore how network application programs are actually created. - When a client program and a server program are executed, a client process and a server process are created, and these processes communicate with each other by reading from and writing to sockets (see the sketch below). - There are two types of network applicat..
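A minimal sketch of the client/server socket pattern the excerpt describes, in Python. The host and port values are assumed for illustration; run the server and the client in separate processes:

```python
import socket

HOST, PORT = "127.0.0.1", 12000  # hypothetical address for illustration

def run_server():
    # The server creates a listening ("welcoming") socket; accept() returns
    # a new connection socket dedicated to one client process.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)       # read the request from the socket
            conn.sendall(data.upper())   # write the reply back

def run_client():
    # The client connects, writes a request, and reads the response.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"hello")
        print(cli.recv(1024))            # b'HELLO'
```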
Linear Sorting Comparison Sort - All the sorting algorithms we have seen so far are comparison sorts, which use only comparisons to determine the relative order of elements. - Examples include Insertion Sort, Merge Sort, Quick Sort, and Heap Sort. - The best worst-case running time we've seen for comparison sorting is O(N log N). - A decision tree can model the execution of any comparison sort. - The tree ..
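Since the post's title promises linear sorting, here is a minimal counting sort sketch; the assumption of non-negative integer keys bounded by k is mine, not stated in the excerpt. It shows how a non-comparison sort sidesteps the O(N log N) bound:

```python
def counting_sort(a, k):
    """Stably sort a list of integers in range [0, k] in O(N + k) time.

    Counting sort indexes on key values instead of comparing elements,
    so the comparison-sort lower bound does not apply.
    """
    count = [0] * (k + 1)
    for x in a:                  # tally each key
        count[x] += 1
    for i in range(1, k + 1):    # prefix sums: boundary of each key's block
        count[i] += count[i - 1]
    out = [0] * len(a)
    for x in reversed(a):        # place right-to-left for stability
        count[x] -= 1
        out[count[x]] = x
    return out

print(counting_sort([4, 1, 3, 4, 3], 4))  # [1, 3, 3, 4, 4]
```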
Application Layer (2/3) Case Study 3) P2P Applications - While the previous examples such as the Web, E-mail, and DNS all employ client-server architectures, P2P architecture operates differently. - In P2P architecture there is minimal or no reliance on always-on infrastructure servers; instead, pairs of intermittently connected hosts called peers communicate directly with each other. Distribution Time - Let's consider an example of distributing ..
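The truncated example presumably compares client-server and P2P distribution times; a small sketch of the standard lower bounds follows, with file size F in bits, server upload rate u_s, slowest peer download rate d_min, and peer upload rates u_i. All numbers below are made up for illustration:

```python
def dist_time_client_server(N, F, u_s, d_min):
    # The server must upload N copies; the slowest peer must download one.
    return max(N * F / u_s, F / d_min)

def dist_time_p2p(N, F, u_s, d_min, peer_uploads):
    # In P2P, the aggregate upload capacity (server + all peers) serves N copies.
    return max(F / u_s, F / d_min, N * F / (u_s + sum(peer_uploads)))

N, F = 10, 1e9              # 10 peers, 1 Gbit file (assumed values)
u_s, d_min = 50e6, 10e6     # 50 Mbps server upload, 10 Mbps slowest download
peers = [5e6] * N           # each peer uploads at 5 Mbps
print(dist_time_client_server(N, F, u_s, d_min))  # 200.0 seconds
print(dist_time_p2p(N, F, u_s, d_min, peers))     # 100.0 seconds
```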
Application Layer (1/3) Goal: 1. Learn about the conceptual and implementation aspects of network application protocols (transport-layer service models, the client-server paradigm, the Peer-to-Peer paradigm, Content Distribution Networks). 2. Learn about protocols by examining popular application-level protocols (HTTP, FTP, SMTP, POP3, IMAP, DNS). 3. Create network applications with the Socket API. Chapter 2.1) Principles of Network Ap..
Reformer / Longformer : The Efficient Transformer & The Long Document 1. Paper title: Reformer / Longformer : The Efficient Transformer & The Long Document 2. Venue: ICLR 2020 / - 3. Key keywords: Quadratic Complexity with sequence length, Limitation to length, Sparse Attention, Locality Sensitive Hashing, Sliding Window Attention, Global Attention 4. Summary: Compared to RNN-based models, Transformer-based models lowered the order of complexity in model_dim but raised it in sequence length. This is why fixed lengths such as 512 or 1024..
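As a rough illustration, not the papers' actual implementations, of how a Longformer-style sliding window tames the quadratic cost: the mask below lets each token attend only to a local neighborhood, so nonzero entries grow linearly with sequence length rather than quadratically:

```python
import numpy as np

def sliding_window_mask(seq_len, window):
    """Boolean mask where token i may attend to token j only if
    |i - j| <= window, giving O(seq_len * window) attention cost
    instead of the full O(seq_len ** 2)."""
    idx = np.arange(seq_len)
    return np.abs(idx[:, None] - idx[None, :]) <= window

mask = sliding_window_mask(8, 2)
print(mask.sum())  # nonzero entries scale with seq_len, not seq_len**2
```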
Pay Attention to MLPs 1. Paper title: Pay Attention to MLPs 2. Venue: - 3. Key keywords: FeedForward Nature, Inductive Bias, Static Parameterization, Spatial Projection 4. Summary: It remained unclear whether the great success of Transformers owed more to Multi-head Self-Attention or to the Feedforward Layer. Noting that the Feedforward Layer's contribution may be large, the authors propose the Spatial Gating Unit, which adds a Spatial Projection, capable of an attention-like effect, alongside the usual Channel Projection. ML..
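A minimal sketch of the Spatial Gating Unit idea; the shapes and initialization follow my reading of the gMLP paper, and the normalization inside the gate is omitted for brevity:

```python
import numpy as np

def spatial_gating_unit(z, W, b):
    """Sketch of a Spatial Gating Unit (simplified; weights are random here).

    z: (seq_len, 2*d) activations; W: (seq_len, seq_len) spatial projection;
    b: (seq_len, 1) bias. The gate mixes information *across positions* with
    a static, learned matrix instead of input-dependent attention.
    """
    z1, z2 = np.split(z, 2, axis=-1)  # split the channels in half
    gate = W @ z2 + b                 # linear projection over the sequence axis
    return z1 * gate                  # elementwise gating

seq_len, d = 6, 4
z = np.random.randn(seq_len, 2 * d)
W = 0.01 * np.random.randn(seq_len, seq_len)  # W near zero, b at one, so the
b = np.ones((seq_len, 1))                     # unit starts close to identity
print(spatial_gating_unit(z, W, b).shape)     # (6, 4)
```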
BIGBIRD : Transformers for Longer Sequences 1. Paper title: BIGBIRD : Transformers for Longer Sequences 2. Venue: NEURIPS 2020 3. Key keywords: Quadratic Dependency, Full-Attention Mechanism, Graph Sparsification Problem, Sparse Attention, Universal Approximator, Turing Completeness 4. Summary: After Transformer, BERT, and GPT, many follow-up papers put their effort into building better-performing models through new pretraining methods. In contrast, the Transformer's (sequence length)^2 time complexity was taken as the problem..
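A simplified sketch of a BigBird-style sparse attention pattern, combining the window, global, and random components named above. This is token-level rather than the paper's block-level implementation, and all sizes are illustrative:

```python
import numpy as np

def bigbird_mask(seq_len, window=1, n_global=1, n_random=2, seed=0):
    """Sparse attention pattern with three components: a sliding window,
    a few global tokens that attend everywhere (and are attended to by
    everyone), and random connections per row."""
    rng = np.random.default_rng(seed)
    idx = np.arange(seq_len)
    mask = np.abs(idx[:, None] - idx[None, :]) <= window  # window component
    mask[:n_global, :] = True                             # global rows
    mask[:, :n_global] = True                             # global columns
    for i in range(seq_len):                              # random component
        mask[i, rng.choice(seq_len, size=n_random, replace=False)] = True
    return mask

print(bigbird_mask(8).astype(int))  # 1s mark allowed query-key pairs
```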