Avoid deadlock between Put and Get #109
Conversation
There is a problem where the batch channel fills up and Puts hold the lock preventing Gets from catching up.
+1

```go
for i, val := range b.items {
	cpItems[i] = val
}
go func() { b.batchChan <- cpItems }()
```
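The pattern under review can be sketched as a minimal runnable example. The field names (`items`, `batchChan`) follow the snippet above; the struct definition, the `flush` wrapper, and the surrounding locking are assumptions for illustration only:

```go
package main

import (
	"fmt"
	"sync"
)

// basicBatcher is a hypothetical minimal version of the batcher under
// discussion; only the field names come from the snippet above.
type basicBatcher struct {
	lock      sync.Mutex
	items     []interface{}
	batchChan chan []interface{}
}

// flush copies the current batch and hands it to a goroutine, so the
// caller never blocks on a full batchChan while holding the lock.
func (b *basicBatcher) flush() {
	cpItems := make([]interface{}, len(b.items))
	for i, val := range b.items {
		cpItems[i] = val
	}
	b.items = nil
	// Sending from a goroutine means a full channel can no longer
	// deadlock a Put (holding the lock) against a Get.
	go func() { b.batchChan <- cpItems }()
}

func main() {
	b := &basicBatcher{batchChan: make(chan []interface{}, 1)}
	b.items = []interface{}{1, 2, 3}
	b.flush()
	batch := <-b.batchChan
	fmt.Println(len(batch)) // 3
}
```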
This wouldn't necessarily preserve the order of items coming in versus going out: once sends back up, the goroutines could deliver the batches in any order, and even before they back up, ordering isn't guaranteed.
I don't think we care about the ordering, but might want to document it.
+1 if we don't care about ordering per above comment
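The ordering concern above can be demonstrated in miniature: once two flush goroutines are both blocked sending on the same channel, either may win the next send, so consumers see every batch but not necessarily in FIFO order. This is a hypothetical standalone sketch, not code from the PR:

```go
package main

import (
	"fmt"
	"sort"
)

// collectBatches races two goroutines sending on one unbuffered
// channel. Both values always arrive, but the arrival order is
// unspecified, so we sort before returning.
func collectBatches() []int {
	ch := make(chan int) // unbuffered: both senders block until read
	go func() { ch <- 1 }()
	go func() { ch <- 2 }()
	got := []int{<-ch, <-ch}
	sort.Ints(got) // normalize: the runtime may deliver in either order
	return got
}

func main() {
	fmt.Println(collectBatches()) // [1 2] -- contents preserved, order not guaranteed
}
```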
```go
for {
	select {
	case <-b.batchChan:
	case <-time.After(5 * time.Millisecond):
	}
}
```
Seems like there could be a more intelligent/safer way to know if anyone is waiting to put stuff on the channel.
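One safer approach, matching the `b.waiting` counter that appears in the later diff, is to bump an atomic counter before each pending send and combine it with the timeout loop from the snippet above. This is a hypothetical sketch; the `counterChan` type and method names are invented for illustration:

```go
package main

import (
	"fmt"
	"sync/atomic"
	"time"
)

// counterChan tracks how many goroutines are still waiting to put a
// batch on the channel, so a drainer knows when it is truly done.
type counterChan struct {
	ch      chan []interface{}
	waiting int32
}

// put bumps the counter before spawning the sender, so a drainer can
// never observe "channel empty and nobody waiting" while a send is
// still pending.
func (c *counterChan) put(batch []interface{}) {
	atomic.AddInt32(&c.waiting, 1)
	go func() {
		c.ch <- batch
		atomic.AddInt32(&c.waiting, -1)
	}()
}

// drain receives until the channel is empty and no sender is pending;
// the timeout re-checks the condition instead of blocking forever on a
// receive that races with a sender's decrement.
func (c *counterChan) drain() int {
	n := 0
	for len(c.ch) > 0 || atomic.LoadInt32(&c.waiting) > 0 {
		select {
		case <-c.ch:
			n++
		case <-time.After(5 * time.Millisecond):
		}
	}
	return n
}

func main() {
	c := &counterChan{ch: make(chan []interface{}, 1)}
	c.put([]interface{}{"a"})
	c.put([]interface{}{"b"})
	fmt.Println(c.drain()) // 2
}
```

Because the counter is incremented before the sender goroutine starts and decremented only after its send completes, the drain loop cannot exit while a batch is still in flight.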
@brianshannan-wf @tylertreat-wf Comments addressed. @alexandercampbell-wf Can you review the most recent commits?

LGTM but I don't know much about this code.
```diff
@@ -158,6 +165,11 @@ func (b *basicBatcher) Dispose() {
 	b.flush()
 	b.disposed = true
 	b.items = nil
+
+	// Drain the batch channel and all routines waiting to put on the channel
+	for len(b.batchChan) > 0 || atomic.LoadInt32(&b.waiting) > 0 {
```
Not sure if you'll ever hit it, or if it matters, but two threads could both see len(b.batchChan) evaluate to greater than 0 at the same time; if only one item were queued, the other thread would block on the receive on the next line.
We're safe in here because of the lock.
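The lock-based safety argument can be sketched: Dispose holds the batcher's mutex for the entire check-and-receive, so two disposing goroutines cannot race on `len(b.batchChan)`. This is a hypothetical minimal sketch, not the PR's code:

```go
package main

import (
	"fmt"
	"sync"
)

// disposer serializes Dispose calls with a mutex, so the len(ch) check
// and the receive that follows cannot interleave between two callers.
type disposer struct {
	lock     sync.Mutex
	ch       chan int
	disposed bool
}

// Dispose drains the channel exactly once; later calls are no-ops.
func (d *disposer) Dispose() (drained int) {
	d.lock.Lock()
	defer d.lock.Unlock()
	if d.disposed {
		return 0
	}
	d.disposed = true
	// Safe: no other goroutine can pass the len check concurrently,
	// because it would first have to acquire the lock.
	for len(d.ch) > 0 {
		<-d.ch
		drained++
	}
	return drained
}

func main() {
	d := &disposer{ch: make(chan int, 2)}
	d.ch <- 1
	d.ch <- 2
	fmt.Println(d.Dispose()) // 2
	fmt.Println(d.Dispose()) // 0
}
```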
+1
1 similar comment
+1

Investigating regression.

+1

@dustinhiatt-wf @brianshannan-wf @alexandercampbell-wf Fix deadlock in dispose. Ready for review.

+1
1 similar comment
+1