Equalization - short reach, MLstructures, ML for OFCS, MLs - Coggle Diagram
MLstructures
Feed Forward NN
- FFNNs are a fundamental type of NN architecture where information flows in one direction, from input to output, without any feedback loops.
- They consist of an input layer, one or more hidden layers, and an output layer.
- Each neuron in the network computes a weighted sum of its inputs, applies a nonlinear activation function to the sum, and passes the result to the neurons in the next layer.
- Common activation functions include sigmoid, tanh, and rectified linear units (ReLU), which introduce nonlinearity to the network, enabling it to learn complex patterns in the data.
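The layer computation described above (weighted sum, nonlinear activation, pass forward) can be sketched in a few lines of NumPy. This is a minimal illustration, not from the source; the layer sizes and random weights are hypothetical.

```python
import numpy as np

def relu(x):
    # Rectified linear unit: one of the common activations named above.
    return np.maximum(0.0, x)

def forward(x, weights, biases):
    """One forward pass through a feed-forward network: each hidden layer
    computes a weighted sum of its inputs plus a bias, then applies ReLU."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = relu(W @ a + b)
    # Linear output layer (no activation), as is typical for regression.
    return weights[-1] @ a + biases[-1]

# Hypothetical toy network: 2 inputs -> 3 hidden units -> 1 output.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
biases = [np.zeros(3), np.zeros(1)]
y = forward(np.array([1.0, -0.5]), weights, biases)
```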
NN overview
- Neural networks (NNs) are described as universal function approximators because of their ability to represent and approximate any function, given a sufficiently large number of neurons and layers.
- This property stems from the nonlinear activation functions used in NNs, which allow them to capture complex relationships between input and output data.
- The structure of NNs, resembling interconnected neurons with weighted connections, allows them to adapt and learn from data through a process called training, where the weights are adjusted iteratively to minimize prediction errors.
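The training process mentioned above (iteratively adjusting weights to minimize prediction errors) can be shown on the simplest possible case: gradient descent fitting a line to noisy data. A minimal sketch, with toy data and learning rate chosen for illustration.

```python
import numpy as np

# Toy labeled data: learn y = 2x + 1 from noisy samples.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=100)
y = 2.0 * x + 1.0 + 0.01 * rng.standard_normal(100)

w, b = 0.0, 0.0   # weights, adjusted iteratively during training
lr = 0.1          # learning rate
for _ in range(500):
    err = (w * x + b) - y
    # Gradient of the mean squared prediction error w.r.t. w and b.
    w -= lr * 2 * np.mean(err * x)
    b -= lr * 2 * np.mean(err)
```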
ML for OFCS
Network types
Each type has unique constraints, such as cost, power, and reach.
Key Challenges
Nonlinear fiber effects like Kerr effect cause power-dependent nonlinear distortions.
Ref1: Performance limits in optical communications due to fiber nonlinearity
Ref2: G. P. Agrawal, Nonlinear Fiber Optics
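The power dependence named above can be stated compactly: the Kerr effect makes the fiber's refractive index depend on optical intensity, which in turn produces a power-dependent nonlinear phase shift (standard relations from nonlinear fiber optics, added here for context):

```latex
n(I) = n_0 + n_2 I, \qquad
\varphi_{\mathrm{NL}} = \gamma P L_{\mathrm{eff}}, \qquad
\gamma = \frac{2\pi n_2}{\lambda A_{\mathrm{eff}}}
```

Here n_0 is the linear index, n_2 the nonlinear-index coefficient, P the launch power, and A_eff the effective mode area.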
Coherent technologies, software-defined networking (SDN), and related advances add complexity.
Role of MLs
ML can aid network modeling, control, and automation as complexity increases.
MLs
Categorization
ML algorithms can be categorized based on the problem type (regression or classification) and whether the data is labeled (supervised) or unlabeled (unsupervised).
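Two of the quadrants above can be contrasted in a short NumPy sketch: supervised regression (labeled input–output pairs) versus unsupervised clustering (unlabeled points grouped by proximity). The data and initial centroids are hypothetical.

```python
import numpy as np

# Supervised regression: labeled pairs (x, y); fit a line to the labels.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])      # labels: y = 2x + 1
slope, intercept = np.polyfit(x, y, 1)

# Unsupervised clustering: unlabeled 1-D points, grouped by proximity
# (a tiny k-means with k = 2; no labels are ever provided).
pts = np.array([0.1, 0.2, 0.15, 5.0, 5.1, 4.9])
centroids = np.array([0.0, 1.0])        # hypothetical initial guesses
for _ in range(10):
    labels = np.abs(pts[:, None] - centroids[None, :]).argmin(axis=1)
    centroids = np.array([pts[labels == k].mean() for k in range(2)])
```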
Trend
Regression is more popular for physical-layer problems, while classification is more common for network-layer problems.
Online vs Batch Learning
Online learning continuously updates the model with new data as it arrives.
For neural networks, online learning tunes the weights with new labeled examples.
Unsupervised online methods exist for tasks like clustering/compression.
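The online weight-tuning described above can be sketched as a per-sample stochastic-gradient update: each new labeled example nudges the weights, with no batch retraining. A minimal illustration with a hypothetical linear model and a simulated sample stream.

```python
import numpy as np

class OnlineLinearModel:
    """Minimal online learner: updates its weights from each new
    labeled example as it arrives, instead of retraining on a batch."""
    def __init__(self, n_features, lr=0.05):
        self.w = np.zeros(n_features)
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        return self.w @ x + self.b

    def update(self, x, y):
        # One stochastic-gradient step on this sample's squared error.
        err = self.predict(x) - y
        self.w -= self.lr * 2 * err * x
        self.b -= self.lr * 2 * err

# Simulated stream of labeled samples (toy target: y = 3*x0 - x1).
rng = np.random.default_rng(2)
model = OnlineLinearModel(n_features=2)
for _ in range(2000):
    x = rng.uniform(-1, 1, size=2)
    model.update(x, 3.0 * x[0] - 1.0 * x[1])
```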