WIP: add support for neighbours in loss computation in LearnerND
2 unresolved threads
Closes #120
TODO: add support to output in
TODO: rewrite the code to be more readable; I will do this next week.
As you can see in the plot, it is getting hard to distinguish the LearnerND from the Learner2D :D
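For anyone who wants to reproduce that comparison, here is a minimal sketch of setting up both learners side by side on the same function. The `ring` test function and the loss goal are illustrative assumptions, not part of this MR; `Learner2D`, `LearnerND`, and `adaptive.runner.simple` are adaptive's public API.

```python
import numpy as np
import adaptive

def ring(xy, a=0.2):
    # Smooth function with a sharp circular ridge; a common test case,
    # chosen here purely for illustration.
    x, y = xy
    return x + np.exp(-(x**2 + y**2 - 0.75**2)**2 / a**4)

# Same function, same domain, two learners.
learner_2d = adaptive.Learner2D(ring, bounds=[(-1, 1), (-1, 1)])
learner_nd = adaptive.LearnerND(ring, bounds=[(-1, 1), (-1, 1)])

# Run both to the same loss goal, then compare the sampled points
# (e.g. via learner.plot()).
adaptive.runner.simple(learner_2d, goal=lambda l: l.loss() < 0.01)
adaptive.runner.simple(learner_nd, goal=lambda l: l.loss() < 0.01)
```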
Activity
Cool!
Would it be possible to implement this in a similar way to what we did in !131 (merged)?
added 4 commits

- ac0015b2...2b8d7492 - 3 commits from branch master
- 3c90147a - add support for neighbours in loss computation in LearnerND
added 1 commit
- e35b130e - add support for neighbours in loss computation in LearnerND
```python
        """Simplices originating from a vertex don't overlap."""
        raise NotImplementedError

    def get_neighbours_from_vertices(self, simplex):
```

@Jorn I introduced this method but I am not sure about the name. Could you fix it if you know a better description?
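For context in this thread, here is a plausible sketch of what a method with this name could compute. This is an assumption for illustration, not necessarily this MR's implementation; it relies on the `vertex_to_simplices` mapping that adaptive's `Triangulation` class maintains.

```python
    def get_neighbours_from_vertices(self, simplex):
        # Sketch (assumed): collect every simplex that shares at least
        # one vertex with `simplex`, using the vertex -> set-of-simplices
        # mapping kept by the triangulation. The result still contains
        # `simplex` itself; the caller can discard it if needed.
        return set.union(*(self.vertex_to_simplices[v] for v in simplex))
```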
changed this line in version 6 of the diff
added 1 commit
- 39ddfadd - add support for neighbours in loss computation in LearnerND
added 1 commit
- 2420c417 - add support for neighbours in loss computation in LearnerND
added 1 commit
- c771e7ab - add support for neighbours in loss computation in LearnerND
added 1 commit
- a339de54 - add support for neighbours in loss computation in LearnerND
added 1 commit
- c336a6e1 - add support for neighbours in loss computation in LearnerND
added 1 commit
- f53c378b - add support for neighbours in loss computation in LearnerND
- Resolved by Joseph Weston
```python
    at the boundary.

    Returns
    -------
    loss : float
    """
    neighbors = [n for n in neighbors if n is not None]
    if len(neighbors) == 0:
        return 0

    return sum(simplex_volume_in_embedding([*simplex, neighbour])
               for neighbour in neighbors) / len(neighbors)


def get_curvature_loss(curvature_factor=1, volume_factor=0, input_volume_factor=0.05):
    # XXX: add doc-string!
```

mentioned in issue #125 (closed)
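To make the snippet above concrete: for each neighbouring vertex it measures the volume of the simplex `[*simplex, neighbour]` embedded in the joint (input, output) space, and averages those volumes. That volume shrinks to zero when the function is locally linear across the shared face, so the average acts as a curvature proxy. Below is a hedged usage sketch, assuming this branch is installed; `loss_per_simplex` is LearnerND's hook for custom loss functions, the factor values are the defaults from the diff, and the test function is illustrative.

```python
import numpy as np
import adaptive
from adaptive.learner.learnerND import get_curvature_loss  # from this branch

def f(xy):
    # Illustrative test function with a sharp circular front.
    x, y = xy
    return np.tanh(10 * (x**2 + y**2 - 0.5**2))

# Build the curvature-aware loss with the defaults shown in the diff.
curvature_loss = get_curvature_loss(curvature_factor=1,
                                    volume_factor=0,
                                    input_volume_factor=0.05)

learner = adaptive.LearnerND(f, bounds=[(-1, 1), (-1, 1)],
                             loss_per_simplex=curvature_loss)
adaptive.runner.simple(learner, goal=lambda l: l.loss() < 0.01)
```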