
Add question 238: SimCLR Contrastive Loss (NT-Xent)#578

Open
BARALLL wants to merge 1 commit into Open-Deep-ML:main from BARALLL:new-question-238-contrastive

Conversation

@BARALLL commented Dec 13, 2025

No description provided.

@moe18 (Collaborator) left a comment


Looks good. I had a few notes, and I would limit the number of test cases to 10.

@@ -0,0 +1,5 @@
{
"input": "# N=2 (Total batch 4).\n# 0 and 1 are views of Cat. 2 and 3 are views of Dog.\n# 0 matches 1 (Positive). 0 mismatches 2 and 3 (Negatives).\n\nz = np.array([\n [1.0, 0.0], # 0: Cat View A\n [1.0, 0.0], # 1: Cat View B (Perfect match with 0)\n [0.0, 1.0], # 2: Dog View A (Orthogonal to 0)\n [0.0, 1.0] # 3: Dog View B (Orthogonal to 0)\n])\ntemperature = 0.5",
"output": "2.2395447662218846",

this should be 0.2395, not 2.2395
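To sanity-check that correction, here is a minimal NumPy sketch of the NT-Xent loss, assuming the pairing convention from the example above (rows 2k and 2k+1 are the two views of one sample). The name `nt_xent_loss` mirrors the PR's test cases, but this is an illustrative implementation, not the PR's actual code. For the example input with temperature 0.5 it gives ≈0.2395, so the leading "2" in the recorded output does look spurious.

```python
import numpy as np

def nt_xent_loss(z, temperature):
    """SimCLR NT-Xent loss. z is (2N, d); rows 2k and 2k+1 are positive pairs."""
    z = np.asarray(z, dtype=float)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # unit vectors, so dot = cosine sim
    sim = (z @ z.T) / temperature                     # scaled pairwise similarities
    np.fill_diagonal(sim, -np.inf)                    # exclude self-pairs from the softmax
    idx = np.arange(len(z))
    pos = sim[idx, idx ^ 1]                           # positive partner: flip the last bit
    # cross-entropy per anchor: logsumexp over the row minus the positive logit
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    return float(np.mean(logsumexp - pos))

# Example from the diff: two cat views, two dog views, temperature 0.5
z = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
print(nt_xent_loss(z, 0.5))  # ≈ 0.2395447662218846
```

By symmetry every anchor here contributes the same term, -log(e^2 / (e^2 + 2)) ≈ 0.2395, which matches the digits of the recorded output after the stray leading "2".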

@@ -0,0 +1,66 @@
[
{
"test": "print(nt_xent_loss([[1, 0], [1, 0], [0, 1], [0, 1]], 1.0))",

I would round these results in the test case

