Adls data plane task add debug functionality #5573
Merged: cormacpayne merged 9 commits into adls-data-plane from adls-data-plane-task-addDebugFunctionality on Mar 20, 2018.
Commits (9), by rahuldutta90:

ecc8e6d  First commit of the target logger
9da683c  Merged from preview
c030dd1  Merge adls-data-plane to this branch
b599d75  Fix the change log and merge issues
8cf6bb9  Revert the psd1 file, address other comments of review
2686057  Remove unused usings and make the target class internal
dd92629  Change changelog
104a29c  Merge adls-data-plane
97935bd  Update the ADLS version to include logging details
src/ResourceManager/DataLakeStore/Commands.DataLakeStore/DataPlaneModels/AdlsLoggerTarget.cs (22 additions, 0 deletions)
@@ -0,0 +1,22 @@
using System.Collections.Concurrent;
using NLog;
using NLog.Targets;

namespace Microsoft.Azure.Commands.DataLakeStore.Models
{
    /// <summary>
    /// NLog is used by the ADLS data plane SDK to log debug messages. This custom target
    /// queues the rendered debug messages onto a ConcurrentQueue supplied by the caller.
    /// https://github.com/NLog/NLog/wiki/How-to-write-a-custom-target
    /// </summary>
    [Target("AdlsLogger")]
    internal sealed class AdlsLoggerTarget : TargetWithLayout
    {
        internal ConcurrentQueue<string> DebugMessageQueue;
        protected override void Write(LogEventInfo logEvent)
        {
            string logMessage = Layout.Render(logEvent);
            DebugMessageQueue?.Enqueue(logMessage);
        }
    }
}
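To show how a target like this is typically consumed, here is a minimal sketch of registering it with NLog programmatically and draining the queue into a cmdlet's debug stream. This is an illustration only: it assumes NLog 4.x's programmatic configuration API and the "adls.dotnet.*" logger-name prefix seen in the sample log lines further down; the helper names (AdlsDebugLoggingSketch, EnableDebugCapture, DrainToDebug) are hypothetical and not part of this PR, which may wire the target up differently.

// Illustrative sketch only; assumes NLog 4.x programmatic configuration.
// The actual cmdlet wiring in this PR may differ.
using System;
using System.Collections.Concurrent;
using NLog;
using NLog.Config;
using Microsoft.Azure.Commands.DataLakeStore.Models;

internal static class AdlsDebugLoggingSketch
{
    internal static AdlsLoggerTarget EnableDebugCapture(ConcurrentQueue<string> queue)
    {
        // Create the custom target and point it at the caller's queue.
        var target = new AdlsLoggerTarget { DebugMessageQueue = queue };

        // Route debug-and-above messages from the ADLS SDK loggers to the target.
        // "adls.dotnet.*" matches the logger names seen in the sample lines below (assumption).
        var config = new LoggingConfiguration();
        config.AddTarget("AdlsLogger", target);
        config.LoggingRules.Add(new LoggingRule("adls.dotnet.*", LogLevel.Debug, target));
        LogManager.Configuration = config;

        return target;
    }

    internal static void DrainToDebug(ConcurrentQueue<string> queue, Action<string> writeDebug)
    {
        // Flush any captured SDK log lines to the cmdlet's debug stream.
        while (queue.TryDequeue(out string message))
        {
            writeDebug(message);
        }
    }
}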
src/ResourceManager/DataLakeStore/Commands.DataLakeStore/packages.config (1 addition, 1 deletion)

@@ -1,6 +1,6 @@
 <?xml version="1.0" encoding="utf-8"?>
 <packages>
-  <package id="Microsoft.Azure.DataLake.Store" version="1.1.2" targetFramework="net452" />
+  <package id="Microsoft.Azure.DataLake.Store" version="1.1.4" targetFramework="net452" />
   <package id="Microsoft.Azure.Management.DataLake.Store" version="2.3.0-preview" targetFramework="net452" />
   <package id="NLog" version="4.4.12" targetFramework="net452" />
 </packages>
Can you describe what this actually logs? We want to make sure that the request method and headers, and the response status and headers, are included.
@markcowl This is the debug line NLog posts:
2018-02-16 09:23:59.1720|DEBUG|adls.dotnet.WebTransport|HTTPRequest,Succeeded,cReqId:cc6758eb-656f-4bd7-b589-67165664e899.0,lat:6496,err,Reqlen:0,Resplen:0,token_ns:2,sReqId:2551abc6-ed04-4a99-b71f-264da768cf7d,path:/NewFile,qp:op=CREATE&overwrite=True&leaseid=30038d32-172f-4054-ac08-359421c24282&filesessionid=30038d32-172f-4054-ac08-359421c24282&CreateParent=True&write=true&syncFlag=DATA&api-version=2017-08-01
@markcowl Only in the failure case do we have the HTTP status in the response; for example, here we have InternalServerError:
2018-02-14 20:02:48.5660|DEBUG|adls.dotnet.WebTransport|HTTPRequest,failed,cReqId:e1dc5c6d-a9c1-4202-8d99-7e6c00d19f8d.0,lat:723,errInternalServerError: RuntimeException,Reqlen:15,Resplen:0,token_ns:0,sReqId:a2ed5f89-f8a7-42e6-af1c-187806135668,path:/Test/dir1/testConcurrentAppendParallel_15,qp:op=CONCURRENTAPPEND&appendMode=autocreate&api-version=2017-08-01
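For reference, each of these lines is a pipe-delimited prefix (timestamp, level, logger name) followed by comma-separated fields, most of them name:value pairs. Below is a rough sketch of splitting those fields out; it is purely illustrative, not part of this PR, and assumes the layout shown in the two samples above (AdlsDebugLineParser is a hypothetical helper).

// Illustrative only: split the comma-separated fields out of an ADLS SDK debug line.
// Assumes the "timestamp|LEVEL|logger|field,field,..." layout seen in the samples above.
using System;
using System.Collections.Generic;

internal static class AdlsDebugLineParser
{
    internal static IDictionary<string, string> ParseFields(string debugLine)
    {
        var fields = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);

        // Drop the "timestamp|DEBUG|logger|" prefix and keep the payload after the last pipe.
        string payload = debugLine.Substring(debugLine.LastIndexOf('|') + 1);

        foreach (string part in payload.Split(','))
        {
            // Most fields are "name:value"; bare tokens (e.g. "HTTPRequest", "Succeeded") get an empty value.
            int colon = part.IndexOf(':');
            if (colon > 0)
            {
                fields[part.Substring(0, colon)] = part.Substring(colon + 1);
            }
            else
            {
                fields[part] = string.Empty;
            }
        }

        return fields;
    }
}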
We have been using the SDK for a while in both Java and .NET. For both, we looked at what customers typically need in order to troubleshoot errors, and that is what we log. Customers have been using these logs to troubleshoot large systems such as Hadoop; the format strikes a good balance between readable, analyzable data and the huge volume of logs that some of our runs can generate (e.g., recursive ACL setting or a large data upload). In other words, this is the best balance we have found so far between the volume of data and its usefulness for troubleshooting.