#ai #language
Created at 2022-12-20
# [Anonymous feedback](https://www.admonymous.co/louis030195)
# [[Epistemic status]]
#shower-thought #to-digest
# Related
- [[Readwise/Articles/pytorch-metric-learning - The Easiest Way to Use Deep Metric Learning in Your Application. Modular - Flexible - and Extensible. Written in PyTorch.]]
- [[Computing/Intelligence/Software 3.0]]
- [[Computing/Intelligence/Machine Learning/Fine tuning GPT3 on your second brain]]
- [[Computing/Intelligence/Machine Learning/Criterion of demarcation in AI]]
- [[Computing/Intelligence/Machine Learning/GPT3]]
# TODO
> [!TODO] TODO
# When you understand programming languages better than human languages
Here is a new hack I found for learning things with [[Artificial intelligence|AI]]:
Sometimes I don't get papers/explanations from a math equation alone; I understand a lot better with code.
Math & Python are two languages, just as French and Cow are two others.
I just ask GPT3 (or any #llm) to explain the thing in Python.
Example prompt:
Implementation of Huber Loss in Python
```py
import numpy as np

def huber_loss(y_true, y_pred, delta=1.0):
    # Residual between target and prediction
    error = y_true - y_pred
    abs_error = np.abs(error)
    # Quadratic penalty for small errors (|error| <= delta)
    quadratic = 0.5 * np.square(error)
    # Linear penalty for large errors (|error| > delta)
    linear = delta * (abs_error - 0.5 * delta)
    # Pick the right branch element-wise
    return np.where(abs_error <= delta, quadratic, linear)
```
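A quick sanity check I added myself (not part of the GPT3 output); the inputs are made up just to hit both branches with `delta=1.0`:
```py
y_true = np.array([1.0, 1.0])
y_pred = np.array([0.5, 3.0])
# errors are 0.5 (quadratic branch) and -2.0 (linear branch)
print(huber_loss(y_true, y_pred))  # 0.125 and 1.5
```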
Now I can ask for the mathematical equation in order to understand the translation:
Huber Loss LaTeX Equation
$$L_{\delta}(a) =
\begin{cases}
\frac{1}{2}a^2 & \text{if } |a| \leq \delta \\
\delta \left( |a| - \frac{1}{2}\delta \right) & \text{otherwise}
\end{cases}$$
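Plugging the same made-up numbers into the equation (my own check, with $a = y_{\text{true}} - y_{\text{pred}}$ and $\delta = 1$) gives the same values as the code above:
$$L_{1}(0.5) = \tfrac{1}{2}(0.5)^2 = 0.125, \qquad L_{1}(-2) = 1 \cdot \left(2 - \tfrac{1}{2}\right) = 1.5$$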