
Fastai Learning


Having a low training loss but a high validation loss does not necessarily mean that your model is already overfitting. This is a common point of confusion when starting out and trying to understand these ideas.

As mentioned in the book, it is when the validation set accuracy stops getting better and instead goes in the opposite direction, getting worse, that you can start thinking the model is memorizing the training set rather than generalizing, i.e. it has started to overfit. Taken straight from the book, Chapter 1:

[Image: excerpt from the book, Chapter 1]

In other words, as training progresses, once the validation loss starts getting consistently worse than it was before (meaning the model is no longer generalizing as well to unseen data, even while it keeps getting better on the training data), the model has started to overfit.
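A rough sketch of how you might keep an eye on this in fastai. The pets dataset setup, the number of epochs, and the patience value are purely illustrative and not from the original post; `SaveModelCallback`, `EarlyStoppingCallback`, and `learn.recorder` are standard fastai pieces.

```python
from fastai.vision.all import *

# Illustrative setup (Chapter-1 style pets example); swap in your own DataLoaders.
path = untar_data(URLs.PETS) / 'images'

def is_cat(fname): return fname.name[0].isupper()

dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path), valid_pct=0.2, seed=42,
    label_func=is_cat, item_tfms=Resize(224),
)

learn = vision_learner(dls, resnet34, metrics=error_rate)

# Watch valid_loss every epoch; stop once it has been getting worse for a few
# epochs in a row -- i.e. once the model looks like it has started to overfit.
learn.fine_tune(
    20,  # illustrative; pick your own budget
    cbs=[
        SaveModelCallback(monitor='valid_loss'),                  # keep the best checkpoint
        EarlyStoppingCallback(monitor='valid_loss', patience=3),  # stop when it stops improving
    ],
)

# learn.recorder.values holds per-epoch [train_loss, valid_loss, *metrics],
# so you can see where valid_loss started climbing while train_loss kept falling.
print(learn.recorder.values)
```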

Not to be taken as an exact example, but I've tried to quickly sketch out what this could mean in practice: the validation loss keeps getting worse and the metrics see no improvement.

Source: https://forums.fast.ai/t/lesson-2-official-topic/96033/226?u=kurianbenoy

[Image: rough sketch of what this could look like in practice]
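Not the original sketch, but a made-up matplotlib illustration of the same shape: the training loss keeps dropping while the validation loss bottoms out and then climbs.

```python
import numpy as np
import matplotlib.pyplot as plt

# Purely synthetic curves to show the shape, not real training results.
epochs = np.arange(1, 31)
train_loss = 1.0 * np.exp(-0.15 * epochs)                          # keeps improving
valid_loss = 0.9 * np.exp(-0.12 * epochs) + 0.004 * epochs**1.5    # improves, then worsens

plt.plot(epochs, train_loss, label='train_loss')
plt.plot(epochs, valid_loss, label='valid_loss')
plt.axvline(epochs[np.argmin(valid_loss)], linestyle='--',
            label='valid_loss starts rising')  # roughly where overfitting begins
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()
```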