
Add positional encoding transforms #509

Open
yliu2-sc wants to merge 8 commits into main from yliu2/positional_encoding_transform

Conversation

yliu2-sc (Collaborator) commented Feb 24, 2026

Scope of work done

  • Add PyG-style positional encoding transforms, including:
    • random walk positional encoding (PE)
    • random walk structural encoding (SE)
    • hop distance encoding
  • Add unit tests for the transforms and utils
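For reviewers unfamiliar with random-walk PE, here is a minimal sketch of the underlying idea, not the implementation in this PR: node i's encoding at position k is the probability that a length-k random walk starting at i returns to i. All names below are illustrative.

```python
import torch

def random_walk_pe(edge_index: torch.Tensor, num_nodes: int,
                   walk_length: int) -> torch.Tensor:
    """Random-walk PE sketch: pe[i, k-1] is the probability that a
    length-k random walk from node i lands back on node i."""
    # Dense adjacency is fine for a sketch; real transforms use sparse ops.
    adj = torch.zeros(num_nodes, num_nodes)
    adj[edge_index[0], edge_index[1]] = 1.0
    deg = adj.sum(dim=1).clamp(min=1.0)
    trans = adj / deg.unsqueeze(1)     # row-normalized transition matrix
    pe = []
    walk = trans
    for _ in range(walk_length):
        pe.append(walk.diagonal())     # return probabilities at step k
        walk = walk @ trans
    return torch.stack(pe, dim=1)      # shape: [num_nodes, walk_length]
```

On a triangle graph, for example, the step-1 return probability is 0 for every node and the step-2 probability is 0.5, since each of the two neighbors walks back with probability 0.5.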

Where is the documentation for this feature?: N/A

Did you add automated tests or write a test plan?: YES, unit tests for the transforms and utils are included.

Updated Changelog.md? NO

Ready for code review?: NO

yliu2-sc and others added 3 commits February 23, 2026 12:54
Use getattr with default None instead of direct attribute access, which
raises AttributeError on NodeStorage objects without an x attribute.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
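The commit message above describes a standard defensive pattern; a self-contained sketch (the NodeStorage class here is a bare stand-in for PyG's per-node-type storage, purely for illustration):

```python
class NodeStorage:
    """Stand-in for PyG's per-node-type storage (illustrative only)."""

store = NodeStorage()

# Direct access raises when the node type carries no features:
#     x = store.x  ->  AttributeError
# getattr with an explicit default returns None instead of raising:
x = getattr(store, "x", None)
```

The caller can then branch on `x is None` rather than wrapping the access in try/except.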

```python
if num_nodes == 0:
    for node_type in data.node_types:
        data[node_type][self.attr_name] = torch.zeros(
            # ... (remainder of the call is truncated in the review view)
```
Collaborator Author

It's currently attached as a separate attribute on each node type. We could also attach it to x by passing different parameters to add_node_attr(data, pe, attr_name=None). I can add that function if needed.
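A hypothetical add_node_attr along the lines of the comment above (the signature comes from the comment; the behavior and the Data stand-in are assumptions, not PyG's actual API):

```python
import torch

class Data:
    """Minimal stand-in for torch_geometric.data.Data (illustrative)."""
    def __setitem__(self, key, value):
        setattr(self, key, value)

def add_node_attr(data, value, attr_name=None):
    """Hypothetical helper: store `value` under attr_name, or fold it
    into the feature matrix x when attr_name is None."""
    if attr_name is not None:
        data[attr_name] = value
    else:
        x = getattr(data, "x", None)
        data.x = value if x is None else torch.cat([x, value], dim=-1)
    return data
```

With attr_name=None the encoding is concatenated onto x, so downstream models see one feature matrix; with a name it stays a separate attribute per node type, as in the current PR.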
