A new version of Object Mapper for Python has been released. Get it on PyPI via
pip install object-mapper
It supports mapping of “hidden” attributes, which can be useful when mapping, e.g., SQLAlchemy objects.
You can clone it from GitHub too. Enjoy!
After a long, long time there is a new version of the Config Transformation Extension, v1.4.0.0, with support for Visual Studio 2017 (finally :)). You can clone it from GitHub too.
Deploying a single project to different Azure App Services is a common task – one service for test, one for prelife and another one for production. There is no problem with such a scenario. But when you are using a WebJobs project, it can cause an issue.
WebJobs use Azure Storage for various purposes. You must specify the AzureWebJobsDashboard and AzureWebJobsStorage connection strings in the WebJob configuration.
If you are deploying the same project with WebJobs to a different App Service, you MUST SET THE CONNECTION STRINGS TO A DIFFERENT STORAGE ACCOUNT. Otherwise these WebJobs will collide – they will access the same metadata because of the same path name. E.g., in the case of a time-triggered WebJob it can happen that only the first WebJob will be executed periodically and the second one never.
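For illustration, a minimal sketch (not from the original post) of a WebJobs Program.cs using the WebJobs SDK 2.x – the point is that each App Service (test, prelife, production) must point these connection strings to its own storage account:

using Microsoft.Azure.WebJobs;

class Program
{
    static void Main()
    {
        var config = new JobHostConfiguration();

        // By default the SDK reads the AzureWebJobsStorage and AzureWebJobsDashboard
        // connection strings configured on the App Service; they can also be set explicitly.
        // Each App Service deployment must use a different storage account here, otherwise
        // the scheduling metadata of the WebJobs collides:
        // config.StorageConnectionString = "<storage account of THIS App Service>";
        // config.DashboardConnectionString = "<storage account of THIS App Service>";

        config.UseTimers();
        new JobHost(config).RunAndBlock();
    }
}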
Note: There is a GitHub issue for this that is still open because of its low priority.
If you are using your Raspberry Pi with a display, e.g., like here, you probably want it to launch into kiosk mode automatically after start or reboot.
The simplest way is to execute a script. In the script you can start the Chromium or Epiphany browser, etc. The following snippet represents the file start_kiosk.sh. It launches the Epiphany browser on localhost in full-screen mode.
epiphany-browser -a --profile /home/pi/.config http://localhost --display=:0 &
sleep 15s
xte "key F11" -x:0
The next step is to set up the script execution. It can be done by adding the script to the file /home/pi/.config/lxsession/LXDE-pi/autostart:
@/usr/bin/start_kiosk.sh
With this setup you can turn your Raspberry Pi into a simple kiosk device. There exist more sophisticated solutions for a real kiosk mode, e.g., disabling user access, etc.
This simple solution just saves the manual start of the browser or another app after the Raspberry Pi starts.
We were requested to incorporate functionality that ensures single user login (access). It means that if another client logs into the application, the first client (already logged in) will be logged out on its next request.
Because the app uses JWT tokens, as described in the previous post, we incorporated this functionality into that approach.
The process workflow is as follows: during login, the id (jti) of the newly issued token is stored on the user record; on every subsequent request the token id from the incoming JWT is compared with the stored one, and only the most recently issued token is treated as valid.
IdentityService contains a new method ValidateToken that compares the token ids. The result of the comparison is added as a claim named ValidTokenId to the TokenValidatedContext. The updated IdentityService looks as follows:
public class IdentityService
{
    public static string ValidTokenId = "ValidTokenId";

    private static readonly Random Random = new Random();

    private readonly Configuration configuration;

    public IdentityService(Configuration configuration)
    {
        this.configuration = configuration;
    }

    public async Task ValidateToken(TokenValidatedContext context)
    {
        var claims = new List<Claim>();
        var sub = (context.SecurityToken as JwtSecurityToken)?.Claims
            .FirstOrDefault(e => e.Type == JwtRegisteredClaimNames.Sub)?.Value;
        if (sub != null)
        {
            Common.Db.Model.User user;
            using (var dbContext = new DatabaseContext(DatabaseContext.GetDbOptions(this.configuration.DbConnection)))
            {
                user = dbContext
                    .Users
                    .AsNoTracking()
                    .FirstOrDefault(e => string.Equals(e.Email, sub, StringComparison.OrdinalIgnoreCase));
            }

            if (user != null)
            {
                claims.Add(new Claim(ValidTokenId, user.UserTokenId == context.SecurityToken.Id ? "true" : "false", ClaimValueTypes.Boolean));
                var claimsIdentity = context.Ticket.Principal.Identity as ClaimsIdentity;
                claimsIdentity.AddClaims(claims);
            }
        }

        await Task.CompletedTask;
    }

    private static string GetSalt()
    {
        var bytes = new byte[128 / 8];
        using (var keyGenerator = RandomNumberGenerator.Create())
        {
            keyGenerator.GetBytes(bytes);
            return BitConverter.ToString(bytes).Replace("-", string.Empty).ToLower();
        }
    }

    private string GeneratePasswordHash(string password, string salt)
    {
        // NOTE: Here you should generate the password hash by your own algorithm :).
        throw new NotImplementedException();
    }

    public string GetPasswordHash(string password)
    {
        return this.GeneratePasswordHash(password, GetSalt());
    }

    public bool IsPasswordValid(string password, string passwordHash)
    {
        var salt = passwordHash; // NOTE: Get the salt from passwordHash
        var hash = this.GeneratePasswordHash(password, salt);
        return passwordHash == hash;
    }

    public string GetRandomString(int length = 16)
    {
        const string Chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
        return new string(Enumerable.Repeat(Chars, length)
            .Select(s => s[Random.Next(s.Length)]).ToArray());
    }

    public JwtToken GenerateToken(JwtConfig jwtConfig, Credentials credentials, IEnumerable<Role> roles)
    {
        var now = DateTime.UtcNow;
        var tokenId = Guid.NewGuid().ToString();

        // Specifically add the jti (random nonce), iat (issued timestamp), and sub (subject/user) claims.
        // You can add other claims here, if you want:
        var claims = new List<Claim>
        {
            new Claim(JwtRegisteredClaimNames.Sub, credentials.Email),
            new Claim(ClaimTypes.Name, credentials.Email),
            new Claim(JwtRegisteredClaimNames.Jti, tokenId),
            new Claim(JwtRegisteredClaimNames.Iat, DateTimeToUnixSeconds(now).ToString(), ClaimValueTypes.Integer64),
            new Claim(JwtRegisteredClaimNames.Iss, jwtConfig.Issuer)
        };

        foreach (var role in roles)
        {
            claims.Add(new Claim(ClaimTypes.Role, role.Name));
        }

        // Create the JWT and write it to a string
        var jwt = new JwtSecurityToken(
            jwtConfig.Issuer,
            jwtConfig.Audience,
            claims,
            now,
            now.Add(jwtConfig.Expiration),
            jwtConfig.SigningCredentials);
        var encodedJwt = new JwtSecurityTokenHandler().WriteToken(jwt);

        var response = new JwtToken
        {
            AccessToken = encodedJwt,
            ExpiresIn = (int)jwtConfig.Expiration.TotalSeconds,
            TokenId = tokenId
        };

        return response;
    }

    private static long DateTimeToUnixSeconds(DateTime date)
    {
        return (long)date.Subtract(new DateTime(1970, 1, 1)).TotalSeconds;
    }
}
The JWT configuration must be updated to call IdentityService.ValidateToken. The JwtBearerOptions.Events property must be set as follows:
public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory, DatabaseContext databaseContext, Configuration configuration)
{
    CompositionRoot.SetProvider(app.ApplicationServices);

    var jwtConfig = configuration.JwtConfig;
    var signingKey = new SymmetricSecurityKey(Encoding.ASCII.GetBytes(jwtConfig.SecretKey));
    jwtConfig.SigningCredentials = new SigningCredentials(signingKey, SecurityAlgorithms.HmacSha256);
    jwtConfig.SecretKey = null;

    var tokenValidationParameters = new TokenValidationParameters
    {
        // The signing key must match!
        ValidateIssuerSigningKey = true,
        IssuerSigningKey = signingKey,

        // Validate the JWT Issuer (iss) claim
        ValidateIssuer = true,
        ValidIssuer = jwtConfig.Issuer,

        // Validate the JWT Audience (aud) claim
        ValidateAudience = true,
        ValidAudience = jwtConfig.Audience,

        // Validate the token expiry
        ValidateLifetime = true,

        // If you want to allow a certain amount of clock drift, set that here:
        ClockSkew = TimeSpan.Zero
    };

    var identityService = CompositionRoot.Resolve<IdentityService>();

    app.UseJwtBearerAuthentication(new JwtBearerOptions
    {
        Audience = jwtConfig.Audience,
        AutomaticAuthenticate = true,
        AutomaticChallenge = true,
        TokenValidationParameters = tokenValidationParameters,
        Events = new JwtBearerEvents
        {
            OnTokenValidated = identityService.ValidateToken,
            OnChallenge = context => { return Task.CompletedTask; }
        }
    });
}
To validate the particular ValidTokenId claim, the Authorize attribute must be extended with this parameter:
[Authorize("ValidTokenId")]
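Note that the original post does not show where the ValidTokenId policy comes from – in ASP.NET Core the string passed to [Authorize] is a policy name, so a policy with that name has to be registered, typically in Startup.ConfigureServices. A minimal sketch of such a registration (assuming the standard authorization services):

services.AddAuthorization(options =>
{
    // Reject requests whose token id no longer matches the one stored for the user
    options.AddPolicy(IdentityService.ValidTokenId, policy =>
        policy.RequireClaim(IdentityService.ValidTokenId, "true"));
});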
There are probably other ways to ensure single user login in an app, depending on particular requirements. In our case, if another client logs in with the same account, the next request of the first client is rejected with HTTP error 401. This guarantees that only a single client can be logged in with an account – the last one wins.
After a long time, there is another post, now about JWT in .NET Core WebApi. In .NET Core WebApi there are multiple ways to provide authentication, and they can be selected when creating a new project from a template. Even though there exist libraries for JWT, there is no template that generates a stub with this authentication automatically.
There exist other similar solutions, mainly using a middleware that handles a specific route, e.g., api/token, and returns the token.
But I think that a middleware is not needed for this. In this case, I want to return (generate) a JWT token in two cases: when a user logs in and when a new user signs up.
To provide JWT support, we need to install the Microsoft.AspNetCore.Authentication.JwtBearer NuGet package.
First, we create a service for identity management. It contains everything we need:
public class IdentityService
{
    private static readonly Random Random = new Random();

    private readonly Configuration configuration;

    public IdentityService(Configuration configuration)
    {
        this.configuration = configuration;
    }

    private static string GetSalt()
    {
        var bytes = new byte[128 / 8];
        using (var keyGenerator = RandomNumberGenerator.Create())
        {
            keyGenerator.GetBytes(bytes);
            return BitConverter.ToString(bytes).Replace("-", string.Empty).ToLower();
        }
    }

    private string GeneratePasswordHash(string password, string salt)
    {
        // NOTE: Here you should generate a password hash.
        // For some security issue, I will not provide my algorithm :).
        throw new NotImplementedException();
    }

    public string GetPasswordHash(string password)
    {
        return this.GeneratePasswordHash(password, GetSalt());
    }

    public bool IsPasswordValid(string password, string passwordHash)
    {
        var salt = passwordHash; // NOTE: Here you must get the salt from your passwordHash
        var hash = this.GeneratePasswordHash(password, salt);
        return passwordHash == hash;
    }

    public string GetRandomString(int length = 16)
    {
        const string Chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
        return new string(Enumerable.Repeat(Chars, length)
            .Select(s => s[Random.Next(s.Length)]).ToArray());
    }

    public JwtToken GenerateToken(JwtConfig jwtConfig, Credentials credentials, IEnumerable<Role> roles)
    {
        var now = DateTime.UtcNow;
        var tokenId = Guid.NewGuid().ToString();

        // Specifically add the jti (random nonce), iat (issued timestamp), and sub (subject/user) claims.
        // You can add other claims here, if you want:
        var claims = new List<Claim>
        {
            new Claim(JwtRegisteredClaimNames.Sub, credentials.Email),
            new Claim(ClaimTypes.Name, credentials.Email),
            new Claim(JwtRegisteredClaimNames.Jti, tokenId),
            new Claim(JwtRegisteredClaimNames.Iat, DateTimeToUnixSeconds(now).ToString(), ClaimValueTypes.Integer64),
            new Claim(JwtRegisteredClaimNames.Iss, jwtConfig.Issuer)
        };

        foreach (var role in roles)
        {
            claims.Add(new Claim(ClaimTypes.Role, role.Name));
        }

        // Create the JWT and write it to a string
        var jwt = new JwtSecurityToken(
            jwtConfig.Issuer,
            jwtConfig.Audience,
            claims,
            now,
            now.Add(jwtConfig.Expiration),
            jwtConfig.SigningCredentials);
        var encodedJwt = new JwtSecurityTokenHandler().WriteToken(jwt);

        var response = new JwtToken
        {
            AccessToken = encodedJwt,
            ExpiresIn = (int)jwtConfig.Expiration.TotalSeconds,
            TokenId = tokenId
        };

        return response;
    }

    private static long DateTimeToUnixSeconds(DateTime date)
    {
        return (long)date.Subtract(new DateTime(1970, 1, 1)).TotalSeconds;
    }
}
Where
public class JwtConfig
{
    public string Path { get; set; }

    public string Issuer { get; set; }

    public string Audience { get; set; }

    public string SecretKey { get; set; }

    public TimeSpan Expiration { get; set; } = TimeSpan.FromDays(31);

    public Microsoft.IdentityModel.Tokens.SigningCredentials SigningCredentials { get; set; }
}
and
public class Credentials
{
    public string Email { get; set; }

    public string Password { get; set; }
}
The main method GenerateToken takes a JwtConfig argument, which contains the data for the token generation, e.g., the secret key and expiration. Next, it takes additional parameters that are added into the token, e.g., the user credentials (email) and roles.
To generate a token during login, we need the IdentityService.IsPasswordValid method. The login action validates the user's credentials and, if they are valid, generates and returns a new token.
// [HttpPost] Commented because of WordPress code pretty print
public async Task<IActionResult> PostLogin([FromBody] Login userLogin)
{
    var user = this.userRepository.Get(e => e.Email == userLogin.Email);
    if (user == null || !this.identityService.IsPasswordValid(userLogin.Password, user.Password))
    {
        return this.Forbid();
    }

    var token = this.identityService.GenerateToken(this.configuration.JwtConfig, new Credentials { Email = user.Email }, user.UserRoles.Select(e => e.Role));

    user.UserTokenId = token.TokenId;
    await this.userRepository.Update(user);

    return this.Ok(new
    {
        Token = token,
        Profile = new User
        {
            Email = user.Email,
            FirstName = user.FirstName,
            LastName = user.LastName,
            CompanyInfo = new CompanyInfo { Name = user.CompanyInfo?.Name }
        }
    });
}
The sign-up method creates a new user record; next, it generates and returns a new token.
// [HttpPost] Commented because of WordPress code pretty print
public async Task<IActionResult> PostCreate([FromBody] User user)
{
    // First validate Recaptcha
    if (!await this.ValidateGoogleRecaptha(user.RecaptchaCode))
    {
        return this.BadRequest(this.GetMessage("Invalid Recaptcha"));
    }

    var roles = new[] { this.roleRepository.Get(e => e.Name == "User") };
    var userFromDb = this.autoMapperService.Mapper.Map<Common.Db.Model.User>(user);
    userFromDb.Password = this.identityService.GetPasswordHash(userFromDb.Password);

    try
    {
        await this.userRepository.Create(userFromDb, roles);
    }
    catch (DbUpdateException dbex)
    {
        var inner = dbex.InnerException as SqlException;
        if (inner != null && inner.Message.Contains("Cannot insert duplicate key row in object 'dbo.Users"))
        {
            return this.BadRequest(this.GetMessage("User with this email already exists"));
        }

        throw;
    }

    var token = this.identityService.GenerateToken(this.configuration.JwtConfig, new Credentials { Email = user.Email }, roles);

    return this.Created(
        string.Empty,
        new
        {
            Token = token,
            Profile = new User
            {
                Email = user.Email,
                FirstName = user.FirstName,
                LastName = user.LastName,
                CompanyInfo = new CompanyInfo { Name = user.CompanyInfo?.Name }
            }
        });
}
To validate the token in each request, the app must be configured in the Configure method of Startup.cs.
public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory, DatabaseContext databaseContext, Configuration configuration)
{
    CompositionRoot.SetProvider(app.ApplicationServices);

    var jwtConfig = configuration.JwtConfig;
    var signingKey = new SymmetricSecurityKey(Encoding.ASCII.GetBytes(jwtConfig.SecretKey));
    jwtConfig.SigningCredentials = new SigningCredentials(signingKey, SecurityAlgorithms.HmacSha256);
    jwtConfig.SecretKey = null;

    var tokenValidationParameters = new TokenValidationParameters
    {
        // The signing key must match!
        ValidateIssuerSigningKey = true,
        IssuerSigningKey = signingKey,

        // Validate the JWT Issuer (iss) claim
        ValidateIssuer = true,
        ValidIssuer = jwtConfig.Issuer,

        // Validate the JWT Audience (aud) claim
        ValidateAudience = true,
        ValidAudience = jwtConfig.Audience,

        // Validate the token expiry
        ValidateLifetime = true,

        // If you want to allow a certain amount of clock drift, set that here:
        ClockSkew = TimeSpan.Zero
    };

    var identityService = CompositionRoot.Resolve<IdentityService>();

    app.UseJwtBearerAuthentication(new JwtBearerOptions
    {
        Audience = jwtConfig.Audience,
        AutomaticAuthenticate = true,
        AutomaticChallenge = true,
        TokenValidationParameters = tokenValidationParameters
    });
}
Next, each route that requires authentication must be annotated with the [Authorize] attribute, as with other authentication methods in ASP.NET.
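For illustration, a protected action could look like this (the route and action are hypothetical, not taken from the original project):

[Authorize]
[HttpGet("api/profile")]
public IActionResult GetProfile()
{
    // The JWT middleware has already validated the token, so the claims are available here
    return this.Ok(new { Email = this.User.Identity.Name });
}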
When the client receives the generated token in the response, it must store it and add it to the header of every request that requires authentication. The exact way can differ by technology/framework; the important thing is that the header value must start with the word Bearer.
headers['Authorization'] = `Bearer ${accessToken}`
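For example, a C# client could set the same header like this (the URL is just a placeholder and accessToken is the token received from the login response):

var client = new HttpClient();
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
var response = await client.GetAsync("https://example.com/api/profile");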
JWT authentication in .NET Core WebApi is not complicated at all. There exists a NuGet package that provides the token validation, and the only part that must be implemented is the token generation and its configuration.
In the previous post I wrote about possible data loss while working with numbers in Azure DocumentDb. Because I was quite surprised by the behavior, which started to make sense only after I read the specification thoroughly, I wanted to test it in MongoDb too – I had never run into such an issue while using that db.
Compared to Azure DocumentDb, MongoDb uses the BSON serialization format, which defines different data types. In particular, it defines multiple numeric types – int (int32), long (int64), double and decimal (decimal128). This means that the problem encountered in DocumentDb should not occur. So let’s try it.
For the test, I used MongoDb (v3.4.3) in Docker and Robomongo for simple visualization. Next, I created three documents with the property FacebookId and stored the value 10208580988747499 as a string, a long and a double. As you can see in Figure 1, the value was stored correctly as a long. When stored as a double (the value 10208580988747499.0), it was stored as 1.02085809887475e+016.0, which is the same value that DocumentDb stored in my previous test.
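The same test can be reproduced with the official MongoDB C# driver (the database and collection names below are made up; the original test was done directly in Robomongo):

using MongoDB.Bson;
using MongoDB.Driver;

var collection = new MongoClient("mongodb://localhost:27017")
    .GetDatabase("test")
    .GetCollection<BsonDocument>("facebook-ids");

collection.InsertOne(new BsonDocument("FacebookId", "10208580988747499"));  // string – stored exactly
collection.InsertOne(new BsonDocument("FacebookId", 10208580988747499L));   // int64 – stored exactly
collection.InsertOne(new BsonDocument("FacebookId", 10208580988747499.0));  // double – rounded to 1.02085809887475e+16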
Because the issue described in the previous post didn’t let me sleep, and I was surprised that it had never happened to me with a different NoSql db, I wanted to test a similar situation in MongoDb too. Because MongoDb has a richer set of data types, it does not behave like DocumentDb in this case – the difference is caused by the use of different numeric data types.
I have been using Azure Table Storage and MongoDb for a few years for various purposes and projects based on their advantages. Then I started to use DocumentDb instead of MongoDb because it is the “official” NoSql (document based) database in Azure and it can be used just as a service.
Basically, I’m storing some geological, climate and statistics data from my IoT devices, web scrapers and public APIs – quite different data with various structures and data types. And until now I had no issues; it was working as I expected, until…
Note: The FacebookId used here is a sample; I don’t know if it really exists, it is just used as an example.
A few weeks ago I was asked to update a project that uses DocumentDb for storing some user data. One of the properties to be stored was a user’s FacebookId. The value returned by the Facebook SDK in C# is of type long. Ok, so let’s store it – update the code, update the tests. Nothing complex, it works and the tests are passing. Done.
Not at all! Suddenly I found there were duplicates in the test database – same users with the same FacebookId. What? There are tests for it, what is wrong? After debugging I found that there must be a problem with the Azure DocumentDB .NET SDK: a property with the value 10208580988747499 was stored as 10208580988747500. I found a GitHub issue too; it should have been fixed in a newer version, but I already had a newer version. Ok. This SDK uses the Newtonsoft.Json library for JSON (de)serialization. That could be the problem, but it wasn’t – a simple serialization test returned the expected value. So where is the problem? When I tried to store the values directly in the Azure Portal – first as a string and then as a number – I got the following result (see Figure 1 and Figure 2):
DocumentDb supports the data types Null, Whitespace, Object, Value, Array, String, Boolean and Number, where Number follows the IEEE 754 double-precision definition. And that is the problem: the value is rounded. So a user with FacebookId 10208580988747499 was never stored, and that’s why duplicate records were being created (the FacebookId used in the tests was lower, was not rounded, and so the tests were passing – if you are using the long data type, use long values in your tests!).
For a test, you can use the converter at the bottom of this page. The number 10208580988747499 is converted to 1.02085809887475e16, and back, the number 1.02085809887475e16 is converted to 10208580988747500 – exactly the same value that DocumentDb converted and stored.
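The same round trip can be verified directly in C#:

long facebookId = 10208580988747499;
double asDouble = facebookId;               // nearest representable double, ~1.02085809887475E+16
long roundTrip = (long)asDouble;            // 10208580988747500 – the value DocumentDb actually stored
Console.WriteLine(facebookId == roundTrip); // False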
So how to solve this problem? Because the mentioned Azure DocumentDB .NET SDK uses Newtonsoft.Json internally, the solution is quite straightforward – use a custom JsonConverter that serializes the long value to a string (which will be stored) and deserializes it back to a long.
public class LongConverter : JsonConverter
{
    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        var val = JToken.FromObject(value.ToString());
        val.WriteTo(writer);
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        return reader.TokenType != JsonToken.String ? 0 : long.Parse(reader.Value.ToString());
    }

    public override bool CanConvert(Type objectType)
    {
        return true;
    }
}
And the usage:
public class UserProfile : Microsoft.Azure.Documents.Resource
{
    [JsonConverter(typeof(LongConverter))]
    public long FacebookId { get; set; }

    ...
}
Now the FacebookId property will be stored in the db as a string without any rounding, and converted back to a long when reading the data from the db.
DocumentDb is a good NoSql db that is very easy to use, but even so you can face issues. So what I recommend: read the specs, especially about data types, before you start to use a new db. Data is what you store, and data has a type. It can save you time spent inspecting strange behavior.
Update 15.04.2017: Numeric Data Types In MongoDb (Azure DocumentDb Comparison)
I have been running my private IoT network in a simple setup for more than one year. At the time of writing the first script, Azure IoT Hub was in beta and it was too complex for simply playing with sensors. The network consists of:
The service itself does some data pre-processing:
The RPis are running the Raspbian distribution, and the sensor apps are written in Python because there are powerful libraries for GPIO management with excellent manuals and tutorials.
Azure IoT Hub is a complex and powerful service with the ability to consume and produce data for Azure Service Bus, Azure Stream Analytics jobs, MS Power BI, etc. With all these services together it is possible to establish a robust micro-service system. So let’s try it.
I found various good tutorials on how to connect all this stuff, but what I didn’t find was a simple way to send data to IoT Hub – no SDK or tutorial. But after a little bit of searching, I was successful with some existing solutions hidden around the internet and summed it all up into one toolkit that provides:
It is a simple console application (C#) that helps you manage your IoT devices, because every device has to have a key to be able to connect to IoT Hub.
Just set the connection string, which can be found in the IoT Hub instance settings in the Azure Portal, and run it.
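For illustration, a minimal sketch of what such a device-management console app does, using the Microsoft.Azure.Devices service SDK (the connection string and device id below are placeholders, not the actual tool’s code):

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;

class Program
{
    static async Task Main()
    {
        // The "iothubowner" connection string from the IoT Hub settings in the Azure Portal
        var registry = RegistryManager.CreateFromConnectionString("<iothubowner connection string>");

        // Register a device and print the generated key it will use to connect to the hub
        var device = await registry.AddDeviceAsync(new Device("raspberry-pi-01"));
        Console.WriteLine($"Device key: {device.Authentication.SymmetricKey.PrimaryKey}");
    }
}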
I’m planning to rewrite it as a Python lib to be able to auto-setup the IoT device itself.
This project contains a library that can send data to and receive data from IoT Hub via the HTTP protocol. A sample usage can be found in main.py. It was tested with Python 2.7.x, but with some small updates (encoding stuff) it should work with Python 3.x too.
The source can be found on GitHub. Enjoy!
Almost a year ago a friend of mine who develops mobile applications asked me to help him import some data into Azure DocumentDb because he had never worked with it. When I asked him why he was doing it instead of somebody who was responsible for it, he replied that there was nobody else. So we imported the data and wrote a manual. And at this point the fun began.
Company A wanted a mobile game (with some backend stuff). Because of internal rules, one company made the analysis and another company developed the product. So firm B wrote the business analysis, company C did the backend part and company D did the mobile apps. Because of another of company A’s policies, company E was maintaining many backend apps for company A (and it happened that even though company E was taking money for it, they were not able to update the db).
The game had to be developed for all platforms, even the not-so-common Windows Phone, so external developer F made this version for company C. A department of company A (G) was controlling the signing process for the app stores and required manuals for the mobile apps – which is correct. The iOS and Android versions were published without problems, the WP one was not.
Because of “Works on my machine”, there was a problem with the WP app. So why not go to the company and solve the problem on site? The problem was developer F – he didn’t want to go. Don’t ask me why, I don’t know. So they asked me if I could try to make it work. I had never worked with Unity and had written just two simple WP apps for fun – not much experience. It took me a few hours to inspect the code and make it work (e.g., Unity needs full paths that are almost impossible to change, and the better solution is to create the same folder structure as on the original PC where the project was created). So I compiled it and sent it to the store under my account. Good, it was working. Next I wrote a tutorial and went to department G. We ran the app build and it failed – there were missing libraries. After inspection we found the problem: the app was targeted at Visual Studio 2015, but the firm officially ran on Visual Studio 2013, and there had been no requirement stating that. So we changed the referenced libs to older versions, built it, signed it and deployed it. Finally with success, but it took us time that could have been saved.
After more than half a year I was asked to update the app backend. Ok, give me the specification and the code. They sent me the code in a zip file – there was no source control; that was all they had. The app could not be built because of a missing configuration file that had been excluded from version control (it was easy to fix, but it should have been mentioned somewhere – there were custom keys for the app configuration). When I opened the project, there was not a single test, but a lot of TODO comments about what was missing and should be done. No code comments. Next, it had a strange code structure – a mixture of models and business logic together. Entities from DocumentDb were sent out of the API, exposing the internal _id and other properties that should not be published to an API consumer. Next, there was no official staging environment, simply nothing. …
My question is how this is possible these days (and that is not all):
So I hope that this is not a common model of how software for companies is made, especially these days, with hundreds of tutorials, best practices, certifications and audits.